How can neural networks compute with few resources? A lesson from mice.
- Event time: 3:00pm until 4:00pm
- Event date: 24th February 2026
- Speaker: Professor Remi Monasson (Ecole Normale Supérieure, Paris)
Event details
Despite the unsustainable growth in energy consumption by artificial intelligence models, and despite the recognised role of metabolic constraints in brain evolution, the relationship between computation and energy remains insufficiently studied and understood. Recently, Padamsey et al. investigated this relationship experimentally in the context of visual information processing in food-deprived mice. Combining analysis of their activity data with modelling inspired by statistical physics, in particular variants of the Hopfield model, I will propose mechanisms by which neural circuits can save considerable energy with little loss of performance.
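For readers unfamiliar with the Hopfield model mentioned in the abstract, a minimal textbook sketch follows: Hebbian storage of binary patterns and asynchronous recall dynamics that never increase the network's energy. This is a standard illustration only, not the speaker's model; all parameter values here (100 neurons, 5 patterns, 10% corruption) are arbitrary choices for demonstration.

```python
import numpy as np

# Textbook Hopfield network: Hebbian storage plus energy-decreasing
# asynchronous recall. Illustrative only; the talk concerns variants
# of this model whose details are not given in the abstract.

rng = np.random.default_rng(0)
N, P = 100, 5                        # neurons, stored patterns (arbitrary)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian coupling matrix with zero self-couplings
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def energy(s):
    """Hopfield energy E = -(1/2) s^T W s; recall never increases it."""
    return -0.5 * s @ W @ s

def recall(s, sweeps=10):
    """Asynchronous sign updates from a corrupted state toward a memory."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Corrupt 10% of the first stored pattern, then let the dynamics clean it up
probe = patterns[0].copy()
probe[rng.choice(N, size=10, replace=False)] *= -1
out = recall(probe)
print("overlap:", int((out == patterns[0]).sum()), "of", N)
```

At this low storage load (5 patterns for 100 neurons, well below the classical capacity of roughly 0.14 N), recall typically restores the pattern almost perfectly, and the energy of the final state is no higher than that of the corrupted probe.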
About Statistical Physics and Complexity Group meetings
This is a weekly series of webinars on theoretical aspects of Condensed Matter, Biological, and Statistical Physics. It is open to anyone interested in research in these areas.
Find out more about Statistical Physics and Complexity Group meetings.
