Mark the dates September 25–26 for our Bernstein Satellite Workshop on “Networks which do stuff”, which Guillaume Hennequin, Tim Vogels, and I are organizing at this year's Bernstein meeting in Berlin.
Computation in the brain occurs through complex interactions in highly structured, non-random networks. Moving beyond traditional approaches based on statistical physics, engineering-based approaches are opening new vistas on circuit computation by providing novel ways of i) building artificial yet fully functional model circuits, ii) dissecting their dynamics to identify new circuit mechanisms, and iii) reasoning about population recordings made in diverse brain areas across a range of sensory, motor, and cognitive tasks. Thus, the same “science of real-world problems” that is behind the accumulation of increasingly rich neural datasets is now also being recognized as a vast and useful set of tools for their analysis.
This workshop aims to bring together researchers who build and study structured network models, spiking or otherwise, that serve specific functions. Our speakers will present their neuroscientific work at the confluence of machine learning, optimization, control theory, dynamical systems, and other engineering fields, to help us understand these recent developments, critically evaluate their scope and limitations, and discuss their use for elucidating the neural basis of intelligent behaviour.
Date and venue
September 25, 2018, 2:00 – 6:30 pm
September 26, 2018, 8:30 am – 12:30 pm
Tue, Sep 25, 2018

| Time | Speaker | Talk |
|------|---------|------|
| 14:00 | Nataliya Kraynyukova, MPI for Brain Research, Frankfurt a.M., Germany | Stabilized supralinear network can give rise to bistable, oscillatory, and persistent activity |
| 14:40 | Jake Stroud, University of Oxford, UK | Spatio-temporal control of recurrent cortical activity through gain modulation |
| 15:20 | Jorge Mejias, University of Amsterdam, The Netherlands | Balanced amplification of signals propagating across large-scale brain networks |
| 16:30 | Srdjan Ostojic, Ecole normale supérieure, Paris, France | Reverse-engineering computations in recurrent neural networks |
| 17:10 | Chris Stock, Stanford University, USA | Reverse engineering transient computations in nonlinear recurrent neural networks through model reduction |
| 17:50 | Guillaume Hennequin, University of Cambridge, UK | Flexible, optimal motor control in a thalamo-cortical circuit model |
Wed, Sep 26, 2018

| Time | Speaker | Talk |
|------|---------|------|
| 08:30 | Aditya Gilra, University of Bonn, Germany | Local stable learning of forward and inverse dynamics in spiking neural networks |
| 09:10 | Robert Gütig, MPI for Experimental Medicine, Göttingen, Germany | Margin learning in spiking neurons |
| 09:50 | Claudia Clopath, Imperial College London, UK | Training spiking recurrent networks |
| 11:00 | Friedemann Zenke, University of Oxford, UK | Training deep spiking neural networks with surrogate gradients |
| 11:40 | Christian Marton, Imperial College London, UK | Task representation & learning in prefrontal cortex & striatum as a dynamical system |
More details here and general information here.