This fun project started as a summer student project at Telluride 2021, led and advised by Chiara Bartolozzi and me. An incredibly motivated group of summer students taught a robotic fingertip with tactile sensors and…
Tag: spiking neural networks
Announcing SNUFA 2022
We are happy to announce SNUFA 2022, an online workshop focused on research advances in the field of “Spiking Networks as Universal Function Approximators.” SNUFA 2022 will take place online 9-10 November 2022, European afternoons.
Fluctuation-driven initialization for spiking neural network training
Surrogate gradients are a great tool for training spiking neural networks in computational neuroscience and neuromorphic engineering, but what is a good initialization? In our new preprint co-led by Julian and Julia, we lay out…
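To give a flavor of what a fluctuation-driven initialization can look like in practice, here is a minimal Python sketch: it draws input weights for a LIF neuron so that, for an assumed number of Poisson inputs firing at a given rate, the membrane potential ends up with a chosen mean and standard deviation relative to threshold. The parameter names, kernel approximations, and default values below are illustrative assumptions, not the exact expressions from the preprint.

```python
import numpy as np

def fluctuation_driven_init(n_inputs, nu=10.0, tau_mem=20e-3, tau_syn=5e-3,
                            target_mu=0.0, target_sigma=1.0, theta=1.0):
    """Sketch of a fluctuation-driven weight initialization.

    Weights are drawn so that a LIF neuron receiving n_inputs Poisson inputs
    at rate nu has a membrane potential with (approximately) mean target_mu
    and standard deviation target_sigma, measured relative to threshold theta.
    The kernel integrals are rough textbook approximations (assumptions).
    """
    # Approximate integrals of the postsynaptic potential kernel and its square
    ebar = tau_syn                                   # ~ integral of epsilon(t)
    ebar2 = tau_syn**2 / (2 * (tau_mem + tau_syn))   # ~ integral of epsilon(t)^2

    # Campbell's theorem for shot noise: solve for weight mean and std
    # that hit the target membrane statistics.
    mu_w = target_mu * theta / (n_inputs * nu * ebar)
    sigma_w = np.sqrt(
        max((target_sigma * theta) ** 2 / (n_inputs * nu * ebar2) - mu_w**2, 0.0)
    )
    return np.random.normal(mu_w, sigma_w, size=n_inputs)

# Example: initialize 200 afferent weights for one neuron
w = fluctuation_driven_init(200)
```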
Story on Spiking Neural Networks
Here is a quick shout-out to this nice news story on spiking nets and surrogate gradients by Anil Ananthaswamy, the Science Communicator in Residence at the Simons Institute for the Theory of Computing in Berkeley.
Announcing the SNUFA 2021 workshop
I am stoked about the second edition of our successful SNUFA workshop on “spiking neural networks as universal function approximators,” on 2-3 November 2021 (we shifted it from the original date, a week later, to avoid…
Paper: Brain-Inspired Learning on Neuromorphic Substrates
I’m happy to share our new overview paper (https://ieeexplore.ieee.org/document/9317744, preprint: arxiv.org/abs/2010.11931) on brain-inspired learning on neuromorphic substrates in (spiking) recurrent neural networks. We systematically analyze how the combination of Real-Time Recurrent Learning (RTRL; Williams and Zipser, …
Hiring: Information processing in spiking neural networks
We are looking for Ph.D. students to work on the computational principles of information processing in spiking neural networks. The project strives to understand computation in the sparse spiking and sparse connectivity regime, in which…
Online workshop: Spiking neural networks as universal function approximators
Dan Goodman and I are organizing an online workshop on new approaches to training spiking neural networks, Aug 31st / Sep 1st 2020. Invited speakers: Sander Bohte (CWI), Iulia M. Comsa (Google), Franz Scherr (TUG), Emre Neftci (UC Irvine), …
Preprint: The remarkable robustness of surrogate gradient learning for instilling complex function in spiking neural networks
We just put up a new preprint https://www.biorxiv.org/content/10.1101/2020.06.29.176925v1 in which we take a careful look at what makes surrogate gradients work. Spiking neural networks are notoriously hard to train using gradient-based methods due to their…
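The core trick, roughly speaking, is to keep the hard spike threshold in the forward pass but substitute a smooth pseudo-derivative in the backward pass. Below is a minimal PyTorch sketch of such a surrogate spike function using a fast-sigmoid-shaped surrogate; the steepness value beta and the specific surrogate shape are assumptions for illustration, not necessarily the ones studied in the preprint.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, fast-sigmoid surrogate gradient
    in the backward pass (illustrative sketch)."""

    beta = 10.0  # surrogate steepness; assumed value for illustration

    @staticmethod
    def forward(ctx, u):
        ctx.save_for_backward(u)
        return (u > 0).float()          # non-differentiable spike nonlinearity

    @staticmethod
    def backward(ctx, grad_output):
        (u,) = ctx.saved_tensors
        # Replace the zero/undefined derivative of the step function with the
        # derivative of a fast sigmoid: 1 / (beta * |u| + 1)^2
        surrogate = 1.0 / (SurrogateSpike.beta * u.abs() + 1.0) ** 2
        return grad_output * surrogate

spike_fn = SurrogateSpike.apply
# Usage: spikes = spike_fn(membrane_potential - threshold)
```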