The rapid development of neuromorphic hardware has made Spiking Neural Networks (SNNs) a thrilling research avenue. Current SNNs, though highly efficient, are less effective than leading Artificial Neural Networks (ANNs), especially in supervised learning tasks. Recent efforts have further demonstrated the potential of SNNs in supervised learning by introducing approximated backpropagation (BP) methods. To deal with the non-differentiable spike function in SNNs, these BP methods utilize information from the spatio-temporal domain to adjust the model parameters. As the time window and network size grow, the computational complexity of spatio-temporal backpropagation increases dramatically. In this paper, we propose a new backpropagation method for SNNs based on the accumulated spiking flow (ASF), i.e., ASF-BP. In the proposed ASF-BP method, parameter updates do not rely on the spike trains of spiking neurons but instead leverage the accumulated inputs and outputs of spiking neurons over the time window, which reduces the BP complexity significantly. We further present an adaptive linear estimation model to approximate the dynamic characteristics of spiking neurons statistically. Experimental results demonstrate that with the proposed ASF-BP method, lightweight convolutional SNNs achieve superior performance compared with other spike-based BP methods on both non-neuromorphic (MNIST, CIFAR10) and neuromorphic (CIFAR10-DVS) datasets. The code is available at https://github.com/neural-lab/ASF-BP.
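To make the core idea concrete, the sketch below illustrates what "accumulated inputs and outputs over the time window" could look like for a leaky integrate-and-fire (LIF) layer, together with a simple linear slope estimate relating the two accumulated quantities. This is a minimal illustration under assumed neuron dynamics (hard reset, fixed threshold and leak); the function names, constants, and the slope formula are our own assumptions for exposition and are not taken from the paper's reference implementation.

```python
# Minimal NumPy sketch of the accumulated-spiking-flow idea (illustrative only).
import numpy as np

def lif_forward_accumulate(inputs, v_th=1.0, tau=2.0):
    """Run an assumed LIF layer over a time window.

    inputs: array of shape (T, N) -- input current per timestep and neuron.
    Returns the accumulated input and accumulated output spike count per
    neuron, i.e. window-level quantities instead of the full spike train.
    """
    T, N = inputs.shape
    v = np.zeros(N)                  # membrane potential
    acc_in = np.zeros(N)             # accumulated input over the window
    acc_out = np.zeros(N)            # accumulated output spikes over the window
    for t in range(T):
        v = v / tau + inputs[t]      # leaky integration (assumed dynamics)
        spikes = (v >= v_th).astype(float)
        v = v * (1.0 - spikes)       # hard reset after a spike
        acc_in += inputs[t]
        acc_out += spikes
    return acc_in, acc_out

def estimate_linear_slope(acc_in, acc_out, eps=1e-6):
    """Illustrative linear estimate: treat accumulated output as roughly
    slope * accumulated input, and use the slope as a surrogate derivative
    when backpropagating through the window-level quantities."""
    return acc_out / np.maximum(acc_in, eps)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.uniform(0.0, 0.6, size=(20, 5))   # T=20 timesteps, 5 neurons
    acc_in, acc_out = lif_forward_accumulate(x)
    slope = estimate_linear_slope(acc_in, acc_out)
    print("accumulated input :", acc_in.round(2))
    print("accumulated output:", acc_out)
    print("estimated slope   :", slope.round(3))
```

The point of the sketch is the change in bookkeeping: the backward pass only needs the two accumulated vectors (one pair per neuron per window), rather than the spike train and membrane trace at every timestep, which is where the reduction in BP complexity comes from.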