Research

The overall aim of our research is to understand the principles that underlie memory formation and information processing in biological neural networks. To that end, we build neural network models with experience-dependent plasticity and study how specific functions can emerge through the orchestrated interplay of different plasticity mechanisms. Topics of interest include:

  • Neural networks (both spiking and non-spiking)
  • Synaptic plasticity, circuit motifs, and homeostasis
  • Predictive and self-supervised learning
  • Biologically plausible deep credit assignment
  • Complex synapses for continual learning

Approach

Our modeling efforts focus on how both the structure and function of neural networks are shaped by plasticity. Specifically, we take a threefold approach that combines simulations, theory, and data analysis. First, to simulate large rate-coding and spiking neural networks with plasticity, we rely on high-performance computing and machine learning techniques. Second, to interpret and understand the dynamics in our models, we employ a variety of analytical tools from dynamical systems, control theory, and statistical physics. Finally, to compare the high-dimensional dynamics of our models with neurobiological data, we work closely with our experimental colleagues on the development and application of practical dimensionality reduction techniques.
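
To give a concrete flavor of the simulation side, below is a minimal sketch, assuming a small network of leaky integrate-and-fire neurons driven by Poisson inputs with a pair-based STDP rule on the input synapses. It is an illustrative toy example in NumPy, not code from our group, and all parameter values are arbitrary assumptions chosen for brevity.

```python
# Minimal illustrative sketch (not lab code): LIF neurons with Poisson
# input and a simple pair-based STDP rule implemented with synaptic traces.
import numpy as np

rng = np.random.default_rng(0)

# --- simulation parameters (illustrative) ---
dt = 1e-3            # time step (s)
T = 5.0              # simulated time (s)
n_in, n_out = 100, 10

# --- leaky integrate-and-fire neuron parameters ---
tau_m = 20e-3        # membrane time constant (s)
v_rest, v_thresh, v_reset = 0.0, 1.0, 0.0

# --- plasticity parameters (pair-based STDP with traces) ---
tau_pre, tau_post = 20e-3, 20e-3
a_plus, a_minus = 0.01, 0.012
w_max = 0.5

# --- state variables ---
w = rng.uniform(0.0, w_max, size=(n_in, n_out))   # input weights
v = np.zeros(n_out)                               # membrane potentials
x_pre = np.zeros(n_in)                            # presynaptic traces
x_post = np.zeros(n_out)                          # postsynaptic traces

rate_in = 10.0  # Poisson input rate (Hz)

for step in range(int(T / dt)):
    # Poisson input spikes in this time step
    pre_spikes = rng.random(n_in) < rate_in * dt

    # membrane dynamics: leak toward rest plus synaptic input
    v += dt / tau_m * (v_rest - v) + pre_spikes @ w
    post_spikes = v >= v_thresh
    v[post_spikes] = v_reset

    # decay the plasticity traces, then add this step's spikes
    x_pre += -dt / tau_pre * x_pre + pre_spikes
    x_post += -dt / tau_post * x_post + post_spikes

    # STDP: potentiate on post spikes (pre-before-post pairings),
    # depress on pre spikes (post-before-pre pairings)
    w += a_plus * np.outer(x_pre, post_spikes)
    w -= a_minus * np.outer(pre_spikes, x_post)
    np.clip(w, 0.0, w_max, out=w)

print("mean weight after plasticity:", w.mean())
```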
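
In the same spirit, the data-analysis step can be sketched with a generic dimensionality reduction: principal component analysis (via SVD) applied to synthetic population activity, standing in for the kind of low-dimensional comparison between model dynamics and recordings. The synthetic data and all parameters below are assumptions for illustration only.

```python
# Minimal illustrative sketch (not our analysis pipeline): project
# high-dimensional "population activity" onto its leading principal
# components so model and recorded trajectories can be compared in the
# same low-dimensional space.
import numpy as np

rng = np.random.default_rng(1)

# synthetic activity: 200 time points x 50 neurons, generated from
# 3 latent trajectories plus noise (a stand-in for model output or data)
t = np.linspace(0, 2 * np.pi, 200)
latents = np.stack([np.sin(t), np.cos(t), np.sin(2 * t)], axis=1)  # (200, 3)
mixing = rng.normal(size=(3, 50))
activity = latents @ mixing + 0.1 * rng.normal(size=(200, 50))

# PCA via singular value decomposition of the mean-centered data
centered = activity - activity.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
explained = S**2 / np.sum(S**2)

k = 3
projection = centered @ Vt[:k].T   # low-dimensional trajectories, shape (200, k)
print("variance explained by first 3 PCs:", explained[:k].sum())
```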