2019
DOI: 10.1109/msp.2019.2931595
Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-Based Optimization to Spiking Neural Networks

Cited by 826 publications (533 citation statements)
References 26 publications
“…To that end, we computed it using the derivative of the hard threshold nonlinearity of the spikes. As expected, the hard threshold nonlinearity prevented gradient flow into the hidden layer [24] and consequently led to poor performance (Fig. 2b,c).…”
Section: Results (supporting; confidence: 75%)
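The gradient-blocking effect described in the excerpt above follows directly from the step-function spike nonlinearity: its true derivative is zero everywhere except at the threshold. A minimal PyTorch sketch of this, where the class name, input values, and overall setup are illustrative rather than taken from the cited work:

```python
import torch

# Hard threshold spike nonlinearity with its *true* derivative in backward.
# Because d Theta(u)/du = 0 for all u != 0, upstream gradients are killed
# and hidden units receive no learning signal.
class HardThreshold(torch.autograd.Function):
    @staticmethod
    def forward(ctx, u):
        return (u > 0.0).float()

    @staticmethod
    def backward(ctx, grad_output):
        return torch.zeros_like(grad_output)

u = torch.tensor([-0.5, 0.1, 0.8], requires_grad=True)  # illustrative membrane potentials
loss = HardThreshold.apply(u).sum()
loss.backward()
print(u.grad)  # tensor([0., 0., 0.]) -- gradient flow into the layer is blocked
```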
“…Second, by constraining their activity through regularization, we showed that surrogate gradients can produce SNNs capable of efficient information processing with sparse spiking activity. Surrogate gradients have been used by a number of studies to train SNNs [24], to solve small-scale toy problems with fractionally predictive neurons [43], to train convolutional SNNs on challenging neuromorphic [44,45] and vision benchmarks [19], and to train recurrent SNNs on temporal problems requiring working memory [21,22]. These studies used a range of surrogate derivatives, including exponential [21], piecewise linear [22], and tanh [27] functions, sometimes with a non-standard neuron model with a constant leak term [19], but owing to the different function choices and datasets they are not easily comparable.…”
Section: Discussion (mentioning; confidence: 99%)
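The surrogate derivatives named in this excerpt (exponential, piecewise linear, tanh) all follow the same recipe: keep the hard threshold in the forward pass and substitute a smooth pseudo-derivative in the backward pass. A hedged PyTorch sketch of that pattern, where the class name, the sharpness parameter `beta`, and the exact functional forms are illustrative choices rather than the definitions used in the cited studies:

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Forward: hard threshold. Backward: a smooth surrogate derivative."""
    beta = 10.0                  # sharpness of the surrogate (illustrative value)
    shape = "exponential"        # or "piecewise_linear", "tanh"

    @staticmethod
    def forward(ctx, u):
        ctx.save_for_backward(u)
        return (u > 0.0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (u,) = ctx.saved_tensors
        b = SurrogateSpike.beta
        if SurrogateSpike.shape == "exponential":
            sg = torch.exp(-b * u.abs())                   # exponential surrogate
        elif SurrogateSpike.shape == "piecewise_linear":
            sg = torch.clamp(1.0 - b * u.abs(), min=0.0)   # triangular / piecewise linear
        else:
            sg = 1.0 - torch.tanh(b * u) ** 2              # derivative of tanh(b * u)
        return grad_output * sg

u = torch.tensor([-0.5, 0.1, 0.8], requires_grad=True)
SurrogateSpike.apply(u).sum().backward()
print(u.grad)  # nonzero: gradients now reach the membrane potentials
```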