A Tandem Learning Rule for Effective Training and Rapid Inference of Deep Spiking Neural Networks
DOI: 10.1109/tnnls.2021.3095724

Cited by 100 publications (62 citation statements). References 46 publications.
“…Interestingly, our proposed CSNN with proxy learning could significantly outperform CSNNs with the tandem learning rule [27,28]. This might be due to the inconsistency between the forward and backward passes of tandem learning.…”
Section: CIFAR10 (mentioning). Confidence: 88%.
“…To reach the optimal classification accuracy, they usually require at least several hundred inference time steps. Tandem learning methods [27,28] could resolve these issues in conversion methods. As with our proxy learning, they assume that IF neurons with a rate code approximate artificial neurons with ReLU activation.…”
Section: Discussion (mentioning). Confidence: 99%.
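The rate-code assumption named in this statement can be checked numerically. The following is a minimal Python sketch, not code from any of the cited papers; the function name `if_neuron_spike_count` and the choices T=100 and threshold=1.0 are illustrative. A non-leaky integrate-and-fire neuron with soft reset, driven by a constant input x, fires at a rate of floor(T·x)/T for x in [0, 1], i.e. within 1/T of relu(x):

```python
def if_neuron_spike_count(x, T=100, threshold=1.0):
    """Simulate a non-leaky IF neuron driven by a constant input x
    for T time steps and return its firing rate."""
    v = 0.0          # membrane potential
    spikes = 0
    for _ in range(T):
        v += x                   # integrate the input current
        if v >= threshold:       # fire once the threshold is crossed
            spikes += 1
            v -= threshold       # soft reset: subtract the threshold
    return spikes / T            # firing rate in [0, 1]

# The firing rate tracks relu(x) for inputs below the saturation rate of 1:
for x in (-0.5, 0.0, 0.13, 0.4, 0.8):
    print(f"x={x:+.2f}  rate={if_neuron_spike_count(x):.2f}  relu={max(x, 0.0):.2f}")
```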
“…• Backpropagation: this class of approaches tries to overcome the non-differentiable nature of SNNs in order to enable spike-based backpropagation. Most approaches (a) approximate the derivatives to make gradient learning possible [25], (b) employ differentiable surrogate activations to replace the non-differentiable threshold operation [31], or (c) use a tandem approach where an SNN performs the prediction and an equivalent standard NN adapts the weights using backpropagation [32]. • Conversion: here the idea is to circumvent the backpropagation problem by training a standard ANN and reusing the trained weights for the SNN.…”
Section: Spiking Neural Network Design (mentioning). Confidence: 99%.
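Option (c) above describes the tandem scheme of the indexed paper. Below is a minimal PyTorch sketch of the core coupling idea only, under stated assumptions: the class name `TandemIF`, the soft-reset IF dynamics, T=20 time steps, and the unit threshold are illustrative, and the actual rule in [32] couples whole SNN and ANN layers rather than a single activation. The forward pass computes spike-derived firing rates, while the backward pass substitutes the gradient of the coupled ANN activation (ReLU here):

```python
import torch

class TandemIF(torch.autograd.Function):
    """Tandem-style coupling: spiking forward pass, ANN backward pass."""

    T = 20            # simulation time steps
    threshold = 1.0   # firing threshold

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        v = torch.zeros_like(x)       # membrane potential
        spikes = torch.zeros_like(x)  # accumulated spike counts
        for _ in range(TandemIF.T):
            v = v + x                                   # integrate input current
            fired = (v >= TandemIF.threshold).float()   # binary spike events
            spikes = spikes + fired
            v = v - fired * TandemIF.threshold          # soft reset on firing
        return spikes / TandemIF.T                      # output the firing rate

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        # Gradient of the ANN counterpart relu(x): pass gradients where x > 0.
        return grad_out * (x > 0).float()

x = torch.randn(4, requires_grad=True)
rate = TandemIF.apply(x)   # spiking forward pass
rate.sum().backward()      # ANN-gradient backward pass
print(rate, x.grad, sep="\n")
```

This deliberate mismatch between the spiking forward pass and the ANN-style backward pass is exactly the forward/backward inconsistency that the first citation statement above suggests as a cause of the accuracy gap.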