

Thomas Nowotny: How to train Spiking Neural Networks efficiently

Head of the AI Research Group, CCNR | Sussex Neuroscience, School of Engineering and Informatics | University of Sussex | UK [Bernstein Seminar]
When: May 24, 2023, 12:15 PM to 01:00 PM
Where: Bernstein Center, Hansastr. 9a, Lecture Hall

Abstract

As Moore’s law is ending and the computing requirements of AI are exploding faster than ever, many think that it is time to go back to the brain for further inspiration. One aspect of how brains work that has not been emulated in artificial neural networks (ANNs) is that neurons communicate by spikes, i.e. all-or-none events that are emitted fairly sparsely, both in space and time. There is a growing community of researchers in neuromorphic computing who seek to use this principle to build new and much more efficient hardware for AI.
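To make the "all-or-none, sparse" idea concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the standard textbook model of a spiking unit. This is an illustration only, not the speaker's GeNN implementation; all parameter values (`tau`, `v_thresh`, the input drive) are arbitrary choices for demonstration.

```python
import numpy as np

def simulate_lif(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Simulate a single leaky integrate-and-fire neuron.

    Returns the membrane voltage trace and a binary spike train:
    a spike is an all-or-none event, emitted only when the membrane
    potential crosses threshold, after which the neuron resets.
    """
    v = v_reset
    voltages, spikes = [], []
    for i_t in input_current:
        # Leaky integration of the input current toward its value
        v += dt / tau * (-v + i_t)
        if v >= v_thresh:
            spikes.append(1)   # all-or-none event
            v = v_reset        # reset after the spike
        else:
            spikes.append(0)
        voltages.append(v)
    return np.array(voltages), np.array(spikes)

# Constant suprathreshold drive: the neuron still fires only sparsely in time
current = np.full(200, 1.5)
v_trace, spike_train = simulate_lif(current)
print("spikes emitted:", spike_train.sum(), "out of", len(spike_train), "steps")
```

Note that even with constant input, the output is a handful of discrete events rather than a continuous activation value; it is this sparsity that neuromorphic hardware exploits for efficiency.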

But training spiking neural networks (SNNs) to solve machine learning tasks has been notoriously difficult. In this presentation, I will talk about the recently discovered EventProp algorithm by Wunderlich and Pehle [1] and how it can be efficiently implemented in our GPU-enhanced Neuronal Networks (GeNN) simulation framework [2] to train recurrent SNNs. I will show results on a speech recognition task and then discuss some interesting issues with using stochastic gradient descent on exact gradients of SNNs, such as those provided by EventProp. More details of the work presented in this talk are in [3].

[1] Wunderlich, T. C., & Pehle, C. (2021). Event-based backpropagation can compute exact gradients for spiking neural networks. Scientific Reports, 11(1), 12829.

[2] Yavuz, E., Turner, J., & Nowotny, T. (2016). GeNN: a code generation framework for accelerated brain simulations. Scientific Reports, 6(1), 1-14. https://github.com/genn-team/genn

[3] Nowotny, T., Turner, J. P., & Knight, J. C. (2022). Loss shaping enhances exact gradient learning with EventProp in spiking neural networks. arXiv preprint arXiv:2212.01232.

About the speaker and his research

Prof. Thomas Nowotny has a background in theoretical physics. After his PhD from Leipzig University in 2001, he started working in computational neuroscience and bio-inspired AI at the Institute for Nonlinear Science at UCSD. He is now a Professor of Informatics at the University of Sussex and head of the AI Research Group. His interests include olfaction, hybrid systems, spiking neural networks and their efficient simulation, bio-inspired AI, and algorithms for neuromorphic computing.

Hosted by Christian Leibold

 
