Neurons in sensory cortex are tuned to diverse features in natural scenes. But what determines which features neurons become selective to? Here we explore the idea that neuronal selectivity is optimized to represent features in the recent sensory past that best predict immediate future inputs. We tested this hypothesis using simple feedforward neural networks, which were trained to predict the next few moments of video or audio in clips of natural scenes. The networks developed receptive fields that closely matched those of real cortical neurons in different mammalian species, including the oriented spatial tuning of primary visual cortex, the frequency selectivity of primary auditory cortex and, most notably, their temporal tuning properties. Furthermore, the better a network predicted future inputs, the more closely its receptive fields resembled those in the brain. This suggests that sensory processing is optimized to extract those features with the most capacity to predict future input.
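
To make the training setup concrete, the sketch below shows one way such a temporal-prediction network could be set up: a single-hidden-layer feedforward network that maps a short window of past frames to the next few frames, after which each hidden unit's input weights can be reshaped into a spatiotemporal receptive field. The layer sizes, window lengths, nonlinearity, and optimizer here are illustrative assumptions, not the authors' exact architecture, and random data stands in for natural movie clips.

```python
# Minimal sketch of a temporal-prediction network (assumed architecture
# and hyperparameters; random data stands in for natural-movie patches).
import torch
import torch.nn as nn

PAST, FUTURE = 7, 2          # frames of sensory past / future (assumed)
H, W, HIDDEN = 16, 16, 100   # patch size and hidden-layer width (assumed)

model = nn.Sequential(
    nn.Linear(PAST * H * W, HIDDEN),
    nn.Tanh(),
    nn.Linear(HIDDEN, FUTURE * H * W),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Stand-in data: random clips in place of natural-scene video patches.
clips = torch.randn(512, PAST + FUTURE, H, W)
past = clips[:, :PAST].reshape(len(clips), -1)
future = clips[:, PAST:].reshape(len(clips), -1)

for step in range(1000):
    opt.zero_grad()
    loss = loss_fn(model(past), future)  # predict future from recent past
    loss.backward()
    opt.step()

# Each row of the first-layer weight matrix is one hidden unit's
# spatiotemporal receptive field: PAST frames of H x W weights.
rfs = model[0].weight.detach().reshape(HIDDEN, PAST, H, W)
```

Trained on natural clips rather than noise, receptive fields recovered this way are what the study compares against measured cortical tuning; the audio case is analogous, with spectrogram slices in place of image frames.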

Original publication

DOI

10.7554/eLife.31557

Type

Journal article

Journal

eLife

Publication Date

18/06/2018

Volume

7

Keywords

auditory, cortex, ferret, model, neuroscience, normative, prediction, Acoustic Stimulation, Animals, Anticipation, Psychological, Auditory Cortex, Computer Simulation, Mammals, Neural Networks, Computer, Photic Stimulation, Reaction Time, Sensory Receptor Cells, Video Recording, Visual Cortex