Integration of Visual Information in Auditory Cortex Promotes Auditory Scene Analysis through Multisensory Binding
Atilgan H., Town SM., Wood KC., Jones GP., Maddox RK., Lee AKC., Bizley JK.
How and where in the brain audio-visual signals are bound to create multimodal objects remains unknown. One hypothesis is that temporal coherence between dynamic multisensory signals provides a mechanism for binding stimulus features across sensory modalities. Here, we report that when the luminance of a visual stimulus is temporally coherent with the amplitude fluctuations of one sound in a mixture, the representation of that sound is enhanced in auditory cortex. Critically, this enhancement extends to include both binding and non-binding features of the sound. We demonstrate that visual information conveyed from visual cortex via the phase of the local field potential is combined with auditory information within auditory cortex. These data provide evidence that early cross-sensory binding provides a bottom-up mechanism for the formation of cross-sensory objects and that one role for multisensory binding in auditory cortex is to support auditory scene analysis.

Atilgan et al. demonstrate that temporal coherence between auditory and visual stimuli shapes the representation of a sound scene in auditory cortex. Auditory cortex is identified as a site of multisensory binding, with inputs from visual cortex underpinning these effects.
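The stimulus manipulation at the heart of the study — a visual luminance trace that is temporally coherent with the amplitude envelope of one sound in a mixture but not the other — can be sketched in code. The following is a minimal illustrative example, not the authors' stimulus-generation code; the sampling rate, envelope statistics, and the use of Pearson correlation as a coherence index are all assumptions made for clarity.

```python
import numpy as np

# Illustrative sketch (not the authors' code): a luminance trace that is
# temporally coherent with the amplitude envelope of a "target" sound but
# independent of a competing "masker" sound. All parameters are hypothetical.

rng = np.random.default_rng(0)
fs = 100                       # envelope sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)   # 10 s of stimulus


def slow_envelope(rng, n, smooth=25):
    """Low-pass-filtered noise as a slowly fluctuating amplitude envelope."""
    e = np.convolve(rng.normal(size=n), np.ones(smooth) / smooth, mode="same")
    return (e - e.min()) / (e.max() - e.min())  # normalize to [0, 1]


env_target = slow_envelope(rng, t.size)   # envelope of the target sound
env_masker = slow_envelope(rng, t.size)   # envelope of a competing sound

# Visual stimulus: luminance tracks the target's envelope, so it is
# temporally coherent with the target sound but not with the masker.
luminance = env_target


def coherence_r(a, b):
    """Pearson correlation as a simple index of temporal coherence."""
    return float(np.corrcoef(a, b)[0, 1])


r_target = coherence_r(luminance, env_target)  # high by construction
r_masker = coherence_r(luminance, env_masker)  # low for independent noise
```

In the actual experiments, the coherent and incoherent pairings were created by modulating the visual stimulus with the envelope of one or the other sound stream; the correlation index above merely illustrates what "temporally coherent" means for such dynamic signals.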