Sussex AI Seminar | Thomas Nowotny: How to train Spiking Neural Networks efficiently
By: Aleks Kossowska
Last updated: Wednesday, 27 March 2024
Prof. Thomas Nowotny is Professor of Informatics and Co-Director of Sussex AI.
Title: How to train Spiking Neural Networks efficiently
Abstract: As Moore’s law is ending and the computing requirements of AI are exploding faster than ever, many think that it is time to go back to the brain for further inspiration. One aspect of how brains work that has not been emulated in ANNs is that neurons communicate by spikes, i.e. all-or-none events that are emitted sparsely, both in space and time. A growing community of researchers in neuromorphic computing seeks to use this principle to build new and much more efficient hardware for AI. But training spiking neural networks (SNNs) to solve machine learning tasks has been notoriously difficult. In this presentation, I will talk about the recently discovered EventProp algorithm by Wunderlich and Pehle [1] and how it can be efficiently implemented in our GPU-enhanced Neuronal Network (GeNN) simulation framework [2] to train recurrent SNNs. I will show results on a speech recognition task and then discuss some interesting issues with using stochastic gradient descent on exact gradients of SNNs, such as those provided by EventProp.
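As background, here is a minimal sketch of the leaky integrate-and-fire (LIF) dynamics typically used in such SNNs, together with the schematic form of the EventProp weight gradient from [1]. The symbol conventions (\(\tau_{\rm mem}\), \(\tau_{\rm syn}\), threshold \(\vartheta\), adjoint variable \(\lambda_I\)) are standard assumptions, not taken from the talk itself, and signs and indexing may differ from the paper:

\[ \tau_{\rm mem}\,\frac{dV}{dt} = -(V - V_{\rm rest}) + I(t), \qquad V(t) \ge \vartheta \;\Rightarrow\; \text{spike and reset } V \to V_{\rm reset} \]

\[ \frac{\partial L}{\partial w_{ji}} \;=\; -\tau_{\rm syn} \sum_{k \,:\, \text{spikes of neuron } i} \lambda_I^{(j)}(t_k) \]

The notable feature is that the gradient accumulates contributions only at the discrete spike times \(t_k\), sampled from an adjoint variable computed in a backward pass, so the backward pass can be as sparse and event-driven as the forward simulation.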
More details of the work presented in this talk are in [3].
[1] Wunderlich, T. C., & Pehle, C. (2021). Event-based backpropagation can compute exact gradients for spiking neural networks. Scientific Reports, 11(1), 12829.
[2] Yavuz, E., Turner, J., & Nowotny, T. (2016). GeNN: a code generation framework for accelerated brain simulations. Scientific Reports, 6(1), 1-14. https://github.com/genn-team/genn
[3] Nowotny, T., Turner, J. P., & Knight, J. C. (2022). Loss shaping enhances exact gradient learning with EventProp in Spiking Neural Networks. arXiv preprint arXiv:2212.01232.
Watch the recording of the seminar, given on 30/11/2023: