
Individual differences in multisensory integration and timing (Open Access)

The senses have traditionally been studied separately, but it is now recognised that the brain is just as richly multisensory as our natural environment. This creates fresh challenges for understanding how complex multisensory information is organised and coordinated around the brain. Take timing, for example: the sight and sound of a person speaking or a ball bouncing may seem simultaneous, but the neural signals from each modality arrive at different multisensory areas of the brain at different times. How do we nevertheless perceive the synchrony of the original events correctly? It is popularly assumed that this is achieved via some mechanism of multisensory temporal recalibration. But recent work from my lab on normal and pathological individual differences shows that sight and sound remain markedly out of synch, by different amounts for each individual and even for different tasks performed by the same individual. Indeed, the same individual can perceive the same multisensory event as having an auditory lead and an auditory lag at the same time. This evidence of apparent temporal disunity sheds new light on the deep problem of understanding how neural timing relates to the perceptual timing of multisensory events. It also leads to concrete therapeutic applications: for example, we may now be able to improve an individual’s speech comprehension by simply delaying sound or vision to compensate for their individual perceptual asynchrony.

Document Type: Research Article

Publication date: February 14, 2016

More about this publication?
  • For more than 30 years, the Electronic Imaging Symposium has served the broad community, from academia and industry, that works on imaging science and digital technologies. The breadth of the Symposium covers the entire imaging science ecosystem, from capture (sensors, cameras) through image processing (image quality, colour and appearance) to how we and our surrogate machines see and interpret images. Applications covered include augmented reality, autonomous vehicles, machine vision, data analysis, digital and mobile photography, security, virtual reality, and human vision. IS&T began sole sponsorship of the meeting in 2016. All papers presented at EI's 20+ conferences are open access.

    Please note: For purposes of its Digital Library content, IS&T defines Open Access as papers that will be downloadable in their entirety for free in perpetuity. Copyright restrictions on papers vary; see individual paper for details.
