
How and where in the brain audio-visual signals are bound to create multimodal objects remains unknown. One hypothesis is that temporal coherence between dynamic multisensory signals provides a mechanism for binding stimulus features across sensory modalities. Here, we report that when the luminance of a visual stimulus is temporally coherent with the amplitude fluctuations of one sound in a mixture, the representation of that sound is enhanced in auditory cortex. Critically, this enhancement extends to include both binding and non-binding features of the sound. We demonstrate that visual information conveyed from visual cortex via the phase of the local field potential is combined with auditory information within auditory cortex. These data provide evidence that early cross-sensory binding provides a bottom-up mechanism for the formation of cross-sensory objects and that one role for multisensory binding in auditory cortex is to support auditory scene analysis.

Original publication

DOI: 10.1016/j.neuron.2017.12.034
Type: Journal article
Publication date: 7 February 2018
Volume: 97
Pages: 640 - 655.e4
Keywords: attention, auditory cortex, auditory-visual, binding, cross-modal, ferret, multisensory, sensory cortex, visual cortex, Acoustic Stimulation, Action Potentials, Animals, Auditory Perception, Female, Ferrets, Neurons, Photic Stimulation, Visual Cortex, Visual Perception