The overall aim of our research is to understand the principles that underlie memory formation and information processing in biological neural networks. To that end, we build neural network models with experience-dependent plasticity and study how specific function can emerge through the orchestrated interplay of different plasticity mechanisms.
Selected areas of interest
- Functional spiking neural networks and neuromorphic learning algorithms through surrogate gradients
- Inhibitory microcircuits and predictive processing
- Learning in neural networks through interacting forms of plasticity
- Role of internal synaptic dynamics in memory consolidation and continual learning
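To make the surrogate-gradient idea above concrete, here is a minimal, self-contained sketch (not our actual codebase): the non-differentiable spike threshold is used in the forward pass, while a smooth "fast sigmoid" surrogate stands in for its derivative in the backward pass. All numbers, names, and the single-neuron setup are illustrative assumptions.

```python
import numpy as np

def spike(v, theta=1.0):
    """Forward pass: non-differentiable Heaviside spike nonlinearity."""
    return float(v > theta)

def surrogate_grad(v, theta=1.0, beta=10.0):
    """Backward pass: smooth fast-sigmoid surrogate for dS/dv."""
    return 1.0 / (beta * abs(v - theta) + 1.0) ** 2

# Toy example: adjust input weights so a threshold unit emits a spike,
# using the surrogate in place of the true (zero almost everywhere) gradient.
rng = np.random.default_rng(0)
x = rng.uniform(0.5, 1.5, size=5)   # presynaptic activity (hypothetical)
w = np.zeros(5)                     # synaptic weights
target = 1.0                        # desired spike output

for _ in range(100):
    v = w @ x                       # membrane potential (single time step)
    s = spike(v)
    # Gradient of the loss (s - target)**2 w.r.t. w, with dS/dv
    # replaced by the surrogate:
    grad_w = 2.0 * (s - target) * surrogate_grad(v) * x
    w -= 0.5 * grad_w
```

The same trick scales to deep spiking networks when the surrogate is registered as a custom backward function in an autodiff framework; the loop here just makes the mechanics explicit.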
Our modeling efforts focus on how plasticity shapes both the structure and the function of neural networks. Specifically, we take a three-fold approach that combines simulations, theory, and data analysis. First, to simulate large rate-based and spiking neural networks with plasticity, we rely on high-performance computing and machine-learning techniques. Second, to interpret and understand the dynamics of our models, we employ a variety of analytical tools from dynamical systems, control theory, and statistical physics. Finally, to compare the high-dimensional dynamics of our models with neurobiological data, we work closely with our experimental colleagues on the development and application of practical dimensionality-reduction techniques.
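As a minimal illustration of the kind of dimensionality reduction meant in the last step, the sketch below runs plain PCA (via SVD) on simulated population activity generated from a few shared latent signals. The data, dimensions, and noise level are all hypothetical assumptions, not results.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated population activity: 100 neurons driven by 3 shared latent
# signals plus weak private noise (hypothetical data).
n_neurons, n_timesteps, n_latents = 100, 500, 3
latents = rng.standard_normal((n_latents, n_timesteps))
mixing = rng.standard_normal((n_neurons, n_latents))
activity = mixing @ latents + 0.1 * rng.standard_normal((n_neurons, n_timesteps))

# PCA via SVD of the mean-centered activity matrix.
centered = activity - activity.mean(axis=1, keepdims=True)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
variance_explained = S**2 / np.sum(S**2)

# With 3 latent signals, the top 3 principal components should capture
# nearly all of the variance in the 100-dimensional activity.
print(variance_explained[:4].round(3))
```

In practice, one would use methods tailored to neural data (e.g. trial structure and spiking noise), but the low-dimensional-latents picture is the same.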