2019
DOI: 10.3389/fnins.2018.00987
Sparse Computation in Adaptive Spiking Neural Networks

Abstract: Artificial Neural Networks (ANNs) are bio-inspired models of neural computation that have proven highly effective. Still, ANNs lack a natural notion of time, and neural units in ANNs exchange analog values in a frame-based manner, a computationally and energetically inefficient form of communication. This contrasts sharply with biological neurons that communicate sparingly and efficiently using isomorphic binary spikes. While Spiking Neural Networks (SNNs) can be constructed by replacing the units of an ANN wi…

Cited by 48 publications (61 citation statements); references 36 publications.
“…Firstly, numerous adaptations to the discontinuous dynamics of SNNs have recently been proposed for learning temporally precise spike patterns [37], [38], [39], [40]. Alternatively, due to the popularity of this method in ANNs, SNNs commonly rely on transferring optimization results from their non-spiking counterparts [41], [42], [43]. In both cases, high accuracy levels are reported in image classification tasks, but still far below those obtained with conventional ANNs.…”
Section: Synaptic Plasticity
confidence: 99%
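The "adaptations to the discontinuous dynamics" mentioned in this statement are commonly realized as surrogate gradients: the spike nonlinearity is kept in the forward pass, but its zero-almost-everywhere derivative is replaced by a smooth stand-in during back-propagation. A minimal sketch of that idea (the function names and the sigmoid surrogate are illustrative choices, not code from the cited papers):

```python
import numpy as np

def heaviside(v):
    """Forward pass: emit a binary spike where the membrane potential
    exceeds the firing threshold (here normalized to 0)."""
    return (v > 0.0).astype(float)

def surrogate_grad(v, beta=10.0):
    """Backward pass: the step function's true derivative is zero almost
    everywhere, so it is replaced by the derivative of a steep sigmoid,
    which lets error back-propagation flow through spike events."""
    sig = 1.0 / (1.0 + np.exp(-beta * v))
    return beta * sig * (1.0 - sig)

v = np.array([-0.5, 0.0, 0.5])   # membrane potentials relative to threshold
spikes = heaviside(v)            # -> [0., 0., 1.]
grads = surrogate_grad(v)        # gradient is largest near the threshold
```

The surrogate is only used for credit assignment; the network still communicates with binary spikes at inference time.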
“…neuromorphic platforms [22] [23] [24] [25] [26]. The third category tries to modify the available deep learning algorithms and directly train an SNN with error back-propagation [27] [28] [29] [30] [31].…”
Section: * Amirreza Yousefzadeh and Mina A. Khoei Contributed Equally
confidence: 99%
“…In this method, the output of an ANN unit is mapped to the average firing rate of a spiking neuron [24]. To reduce the amount of firing and synaptic operations in this coding scheme, some of the hyper-parameters of the SNN (such as the leak rate and the neuron thresholds) can be optimized before [23] [24] or even during [26] inference. Even though using the firing frequency as the neuron output makes the conversion mathematically exact, the equations are based on the average firing rate of the neurons, so many spikes per neuron are needed before the spiking network stabilizes.…”
Section: * Amirreza Yousefzadeh and Mina A. Khoei Contributed Equally
confidence: 99%
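The rate-based mapping this statement describes can be sketched with a plain integrate-and-fire neuron driven by a constant input equal to the ANN activation; `rate_coded_if` is a hypothetical helper for illustration, not code from the cited works:

```python
def rate_coded_if(activation, threshold=1.0, steps=1000):
    """Drive an integrate-and-fire neuron with a constant input equal to
    an ANN unit's activation. Its average firing rate approximates
    max(0, activation) / threshold, which is why the conversion needs
    many time steps (and hence many spikes) to become accurate."""
    v = 0.0
    n_spikes = 0
    for _ in range(steps):
        v += activation
        if v >= threshold:
            v -= threshold      # reset-by-subtraction keeps residual charge
            n_spikes += 1
    return n_spikes / steps     # estimated firing rate

rate = rate_coded_if(0.3)       # approaches 0.3 as steps grows
```

Lowering the threshold or lengthening the simulation improves the rate estimate, which is exactly the spikes-versus-accuracy trade-off the quoted passage points out.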
“…For example, Bohte (2012) proposed a multiplicative Adaptive Spike Response Model which achieves high coding efficiency and maintains it over changes in the dynamic signal range of several orders of magnitude. In Zambrano and Bohte (2016) and Zambrano et al (2017), the authors proposed an Adapting Spiking Neural Network (ASNN) based on adaptive spiking neurons, which uses an order of magnitude fewer spikes while maintaining good performance. In Zambrano et al (2019) and O'Connor et al (2017), they use the speed of adaptation and the effective spike height to control the precision of the spike-based neural coding.…”
Section: Introductionmentioning
confidence: 99%
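The adaptation mechanism these papers exploit can be illustrated with a toy adaptive-threshold neuron. This is a sketch under assumed dynamics; the parameter names and constants are illustrative and do not reproduce the ASNN models of the cited works:

```python
def adaptive_spiking_encode(signal, theta0=1.0, jump=0.5, tau=20.0):
    """Leaky threshold adaptation: every spike raises the threshold by
    `jump`, and the threshold relaxes back to theta0 with time constant
    `tau`. Stronger or faster adaptation lowers the spike count at the
    cost of coding precision, so the speed of adaptation and the
    effective spike height together trade spikes against precision."""
    theta, v = theta0, 0.0
    out = []
    for s in signal:
        v += s
        if v >= theta:
            out.append(1)
            v -= theta                  # effective spike height = current threshold
            theta += jump               # threshold jumps after every spike
        else:
            out.append(0)
        theta += (theta0 - theta) / tau  # exponential decay toward rest
    return out

dense = sum(adaptive_spiking_encode([0.4] * 200, jump=0.0))   # no adaptation
sparse = sum(adaptive_spiking_encode([0.4] * 200, jump=0.5))  # adapting neuron fires less
```

With `jump=0.0` the neuron reduces to ordinary rate coding; raising `jump` (or shrinking `tau`) yields the sparser codes that motivate adaptive SNNs.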