Preprint: Training spiking multi-layer networks with surrogate gradients on an analog neuromorphic substrate

We are happy to share our new preprint “Training spiking multi-layer networks with surrogate gradients on an analog neuromorphic substrate” (https://arxiv.org/abs/2006.07239). This work was led by Benjamin Cramer and Sebastian Billaudelle and is a joint effort with Johannes Schemmel and colleagues at the Kirchhoff-Institute for Physics at the University of Heidelberg.

Spiking neurons are the basic units underlying information processing in the brain. Specifically, neurons integrate their inputs in an analog manner, whereas they communicate their outputs through temporally sparse digital events, a.k.a. spikes.
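This mixed analog/digital behavior is captured by the leaky integrate-and-fire (LIF) model. The following is a minimal sketch of that dynamic, not the dynamics of the actual hardware neurons; the time constants and threshold are illustrative assumptions.

```python
import numpy as np

def lif_neuron(input_current, dt=1e-3, tau=20e-3, v_thresh=1.0, v_reset=0.0):
    """Minimal leaky integrate-and-fire neuron: analog membrane
    integration of the input, digital (binary) spike output."""
    v = 0.0
    spikes = []
    for i in input_current:
        # Analog part: leaky integration of the input current
        v += dt / tau * (-v + i)
        if v >= v_thresh:
            # Digital part: a threshold crossing emits a sparse spike event
            spikes.append(1)
            v = v_reset
        else:
            spikes.append(0)
    return np.array(spikes)
```

For a sustained supra-threshold input the neuron fires a sparse, regular spike train; for zero input it stays silent.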

A BrainScaleS-2 neuromorphic core with analog neurons. Courtesy of KIP, Heidelberg.

The mixed-signal character of neural information processing can be implemented efficiently in analog neuromorphic hardware. However, to mirror the power of biological information processing, we also have to instantiate functional network connectivity, a difficult algorithmic problem.

In this work, we used the BrainScaleS-2 single-chip system as a substrate to emulate the forward pass of a spiking neural network. We then computed surrogate gradients in software and used them to update the hardware weights, which allowed us to sidestep the issue of device mismatch.
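The core idea behind surrogate gradients is to keep the hard spiking nonlinearity in the forward pass while replacing its ill-defined derivative with a smooth surrogate in the backward pass. The sketch below illustrates this with a SuperSpike-style fast-sigmoid surrogate; the forward pass here is plain software, standing in for the hardware emulation, and `weight_update` is a hypothetical single-layer update for illustration, not the pipeline used in the paper.

```python
import numpy as np

def spike_fn(v):
    """Forward pass: hard threshold (non-differentiable Heaviside)."""
    return (v > 0.0).astype(float)

def surrogate_grad(v, beta=10.0):
    """Backward pass: replace the Heaviside derivative with a smooth
    SuperSpike-style surrogate, 1 / (1 + beta * |v|)^2."""
    return 1.0 / (1.0 + beta * np.abs(v)) ** 2

def weight_update(w, v, pre_spikes, error, lr=1e-3):
    """Hypothetical gradient step for one layer: the chain rule uses
    the surrogate in place of d(spike)/d(v)."""
    grad = np.outer(error * surrogate_grad(v), pre_spikes)
    return w - lr * grad
```

In the hardware-in-the-loop setting, the membrane traces and spikes entering such an update would be recorded from the chip, so the gradient is computed with respect to the actual analog circuit behavior rather than an idealized model.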

This hardware-in-the-loop approach allowed us to train spiking neural networks to process temporally encoded inputs using only a small number of spikes. In inference mode, our networks can process on the order of 70k inputs per second within a power budget of less than 300 mW.