This September, Dr Rui Ponte Costa relocates his Neural and Machine Learning group to DPAG. He was previously a Senior Lecturer in Computational Neuroscience and Machine Learning at the University of Bristol. His research group focuses on producing AI-driven brain-wide computational models of learning.

Dr Rui Ponte Costa began his research career as a PhD student at the University of Edinburgh as part of the Institute for Adaptive and Neural Computation. He established a collaboration with the labs of Professor Mark van Rossum and Professor Jesper Sjostrom at UCL to develop computational models of learning and synaptic plasticity in cortical circuits.

His investigations progressed to connecting synaptic plasticity with learning principles as a postdoctoral research scientist, focusing on a machine learning-inspired approach to cortical plasticity. In 2014, he undertook his first postdoctoral position with Associate Professor Tim Vogels at the Centre for Neural Circuits and Behaviour (CNCB) at DPAG. Inspired by statistical machine learning principles, together they introduced a unifying model of long-term synaptic plasticity. In 2017, in collaboration with Professor Rob Froemke of New York University and Professor Nigel Emptage of the University of Oxford's Department of Pharmacology, they showed the model to be consistent with a range of experimental observations while proposing a solution to a decade-long debate in synaptic plasticity. In the same year, Dr Costa established a collaboration with Professor Nando de Freitas of the University of Oxford's Department of Computer Science and Google DeepMind to propose the first mapping between artificial recurrent neural networks and cortical circuits.

Dr Costa's next step was to make more explicit links with biologically realistic neural learning. To do so, he undertook a postdoctoral project with Professor Walter Senn of Universität Bern and Professor Yoshua Bengio, Director of Mila – Quebec AI Institute. Dr Costa helped to develop a new generation of models of hierarchical learning in the cortex, culminating in papers published in 2018 and 2019.

In 2018, Dr Costa established his independent laboratory at the University of Bristol. Dr Costa said: "I decided to zoom out and focus on understanding how a given behavioural outcome ultimately leads to credit being assigned to trillions of synapses across multiple brain areas – a credit assignment problem. We believe that in order to have a unified understanding of how we learn to produce adaptable behaviours it is important to jointly study the contribution of three different systems: cortical circuits, neuromodulation and subcortical regions."

Dr Costa's Neural and Machine Learning group has so far made three major breakthroughs. Firstly, the lab has proposed some of the first solutions to how cortical circuits may learn complex tasks by approximating deep learning algorithms. Secondly, in papers published in 2022 and 2023, the team has shown how the cerebellum (mini-brain) can play a critical role in helping the cerebrum (big-brain) quickly adapt to the environment. Finally, they have introduced an AI-driven theory of cholinergic neuromodulation that explains how the brain speeds up learning and the role of the cholinergic system in cognitive decline. According to this theory, the cholinergic system continuously shifts the focus of learning in the cortex.

Dr Costa is presently relocating his group to the home of his first postdoctoral position, the CNCB. According to Dr Costa: "At Oxford we are aiming to continue our work in understanding the brain-wide principles of learning. We hope to continue developing a new generation of computational models inspired by deep learning in close collaboration with experimentalists."