Neurons in sensory cortex are tuned to diverse features in natural scenes. But what determines which features neurons become selective to? Here we explore the idea that neuronal selectivity is optimized to represent features in the recent sensory past that best predict immediate future inputs. We tested this hypothesis using simple feedforward neural networks trained to predict the next few moments of video or audio in clips of natural scenes. The networks developed receptive fields that closely matched those of real cortical neurons in different mammalian species, including the oriented spatial tuning of primary visual cortex, the frequency selectivity of primary auditory cortex and, most notably, their temporal tuning properties. Furthermore, the better a network predicted future inputs, the more closely its receptive fields resembled those in the brain. This suggests that sensory processing is optimized to extract the features with the most capacity to predict future input.
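The training scheme described above can be illustrated with a minimal sketch: a small feedforward network is trained by stochastic gradient descent to predict the next sample of a temporal signal from a window of its recent past, and the learned input weights then play the role of temporal receptive fields. This is a toy illustration only; the signal, network size, learning rate, and all variable names here are assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
past, n_hidden = 20, 8      # length of the "recent past" window; hidden units
lr, n_steps = 0.01, 5000    # illustrative learning rate and training length

# Toy stand-in for a natural signal: a noisy oscillation
t = np.arange(n_steps + past + 1)
signal = np.sin(0.3 * t) + 0.05 * rng.standard_normal(t.size)

W1 = 0.1 * rng.standard_normal((n_hidden, past))  # past window -> hidden layer
W2 = 0.1 * rng.standard_normal(n_hidden)          # hidden -> predicted next sample

errs = []
for step in range(n_steps):
    x = signal[step:step + past]   # the recent sensory past
    y = signal[step + past]        # the immediate future input to predict
    h = np.tanh(W1 @ x)
    pred = W2 @ h
    err = pred - y
    errs.append(abs(err))
    # One SGD step on the squared prediction error
    dh = err * W2 * (1.0 - h**2)
    W2 -= lr * err * h
    W1 -= lr * np.outer(dh, x)

# After training, the rows of W1 act as learned temporal receptive fields
# over the past window; prediction error should fall during training.
early, late = np.mean(errs[:200]), np.mean(errs[-200:])
```

In this sketch, comparing the mean prediction error early versus late in training shows that the network has extracted predictive structure from the signal's past, analogous to the optimization the paper proposes for cortical receptive fields.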
Keywords: auditory, cortex, ferret, model, neuroscience, normative, prediction, Acoustic Stimulation, Animals, Anticipation, Psychological, Auditory Cortex, Computer Simulation, Mammals, Neural Networks (Computer), Photic Stimulation, Reaction Time, Sensory Receptor Cells, Video Recording, Visual Cortex