Talk in Oxford on Rapid Compensatory Processes

I am delighted to have the opportunity to present my work on learning in spiking neural networks next week (Tuesday, 17 October 2017, 1pm to 2pm) at the “EP Cognitive and Behavioural Neuroscience Seminar” in Oxford.

Title: Making Cell Assemblies: What can we learn about plasticity from spiking neural network models?

Abstract: Long-term synaptic changes are thought to underlie learning and memory. Hebbian plasticity and homeostatic plasticity work in concert to combine neurons into functional cell assemblies. This is the story you know. In this talk, I will tell a different tale. In the first part, starting from the iconic notion of the Hebbian cell assembly, I will show the difficulties that synaptic plasticity has to overcome to form and maintain memories stored as cell assemblies in a network model of spiking neurons. Teetering on the brink of disaster, a diversity of synaptic plasticity mechanisms must work in symphony to avoid exploding network activity and catastrophic memory loss – in order to fulfill our preconception of how memories are formed and maintained in biological neural networks. I will introduce the notion of Rapid Compensatory Processes, explain why they have to work on shorter timescales than currently known forms of homeostatic plasticity, and motivate why it is useful to derive synaptic learning rules from a cost-function approach. Cost functions will also serve as the motivation for the second part of my talk, in which I will focus on the problem of spatial credit assignment. Plastic synapses face this problem when they are part of a network in which information is processed sequentially over several layers. I will introduce several recent conceptual advances in the field that have led to algorithms capable of training spiking neural network models to solve complex tasks. Finally, I will show that such algorithms can be mapped to voltage-dependent three-factor Hebbian plasticity rules and discuss their biological plausibility.
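For those wondering what a voltage-dependent three-factor Hebbian rule looks like in the abstract, here is a rough schematic of the general form (the notation below is purely illustrative; the concrete rule and its derivation from a cost function are part of the talk):

\[
\frac{\mathrm{d}w_{ij}}{\mathrm{d}t} \;=\; \eta \; E_i(t)\; \sigma'\!\big(U_i(t)\big)\; \big(\epsilon \ast S_j\big)(t)
\]

Here \(\eta\) is a learning rate, \(E_i(t)\) is the third factor (an error- or neuromodulator-like signal), \(\sigma'\!\big(U_i(t)\big)\) is a term that depends on the postsynaptic membrane potential \(U_i\), and \(\big(\epsilon \ast S_j\big)(t)\) is a low-pass filtered trace of the presynaptic spike train \(S_j\).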
