2021
DOI: 10.1609/aaai.v35i12.17236

Training Spiking Neural Networks with Accumulated Spiking Flow

Abstract: The rapid development of neuromorphic hardware has made Spiking Neural Networks (SNNs) a thrilling research avenue. Current SNNs, though highly efficient, are less effective than leading Artificial Neural Networks (ANNs), especially in supervised learning tasks. Recent efforts further demonstrate the potential of SNNs in supervised learning by introducing approximated backpropagation (BP) methods. To deal with the non-differentiable spike function in SNNs, these BP methods utilize information from t…
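
To make the non-differentiability concrete, here is a minimal sketch (in PyTorch, and not the paper's Accumulated Spiking Flow method) of the common surrogate-gradient workaround: the forward pass keeps the hard Heaviside spike, while the backward pass substitutes a smooth approximation. The class name and the rectangular surrogate window are illustrative assumptions.

```python
import torch


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, rectangular surrogate
    derivative in the backward pass (illustrative, not the paper's method)."""

    @staticmethod
    def forward(ctx, mem, threshold=1.0):
        ctx.save_for_backward(mem)
        ctx.threshold = threshold
        return (mem >= threshold).float()  # binary spikes

    @staticmethod
    def backward(ctx, grad_output):
        (mem,) = ctx.saved_tensors
        # Pass gradients only near the threshold, where the Heaviside
        # step is approximated by a rectangular window of width 1.
        window = (torch.abs(mem - ctx.threshold) < 0.5).float()
        return grad_output * window, None


# Spikes are binary going forward, yet the loss still backpropagates.
mem = torch.randn(4, requires_grad=True)
spikes = SurrogateSpike.apply(mem)
spikes.sum().backward()
print(mem.grad)
```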

Cited by 38 publications (19 citation statements). References 25 publications.

Citation statements, ordered by relevance:

“…Networks trained with this approach reach accuracy comparable with traditional DNN models, but suitable for implementation in energy-efficient neuromorphic computing. Results close to the latter are achieved in other works in which similar approaches are taken [169,217,218,220,221,227], where similar rate coding schemes are used for the output, but the spiking differentiation problem is solved by using smoothing kernels for the spikes. Specific approaches address the problem of achieving fast inference, in addition to accurate predictions, which can be obtained by leveraging TTFS coding, as in [159,228].…”
Section: Backprop Approaches for SNN Training (supporting)
confidence: 64%
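
The rate-coded readout and spike smoothing mentioned in this statement can be sketched in a few lines. This is an assumed, generic formulation (the shapes, the leak constant tau, and the exponential trace are illustrative), not the exact scheme of any cited work.

```python
import torch

T, batch, num_classes = 20, 8, 10
# Random binary output spike trains standing in for a trained SNN: [T, B, C].
out_spikes = (torch.rand(T, batch, num_classes) < 0.2).float()

# Rate coding for the output: the class score is the mean firing rate
# over the T simulation steps, and the prediction is its argmax.
rates = out_spikes.mean(dim=0)        # [B, C]
pred = rates.argmax(dim=1)            # predicted class per sample

# A causal exponential trace as a simple smoothing kernel: each spike is
# replaced by a decaying exponential, giving a signal that is smooth in time.
tau = 5.0
decay = torch.exp(torch.tensor(-1.0 / tau))
trace = torch.zeros(batch, num_classes)
smoothed = []
for t in range(T):
    trace = decay * trace + out_spikes[t]
    smoothed.append(trace)
smoothed = torch.stack(smoothed)      # [T, B, C]
```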
“…While these techniques are generally effective, they require a multitude of timesteps to emulate float activations using binary spikes. A set of recent studies have suggested the use of surrogate functions to circumvent this non-differentiable backpropagation issue (Lee et al, 2016 , 2020 ; Shrestha and Orchard, 2018 ; Wu et al, 2018 , 2020 , 2021 ; Neftci et al, 2019 ; Li et al, 2021b ; Kim et al, 2022a ). These methods, accounting for temporal dynamics during weight training, exhibit high performance and short latency.…”
Section: Related Work (mentioning)
confidence: 99%
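
How such surrogate methods "account for temporal dynamics" can be illustrated by unrolling a leaky integrate-and-fire (LIF) layer over time and backpropagating through the unrolled graph. The sigmoid surrogate, the straight-through detach trick, and all shapes below are assumptions for illustration, not the formulation of any cited study.

```python
import torch


def surrogate_spike(mem, threshold=1.0, slope=5.0):
    # Hard threshold in the forward pass; sigmoid surrogate in the
    # backward pass, via the standard straight-through detach trick.
    soft = torch.sigmoid(slope * (mem - threshold))
    hard = (mem >= threshold).float()
    return hard.detach() + soft - soft.detach()


def lif_unroll(inputs, weight, beta=0.9, threshold=1.0):
    """inputs: [T, B, in_dim] spike trains; returns [T, B, out_dim]."""
    mem = torch.zeros(inputs.shape[1], weight.shape[0])
    spikes = []
    for x_t in inputs:
        mem = beta * mem + x_t @ weight.t()   # leaky integration
        spk = surrogate_spike(mem, threshold)
        mem = mem - spk * threshold           # soft reset after a spike
        spikes.append(spk)
    return torch.stack(spikes)


weight = torch.nn.Parameter(0.1 * torch.randn(16, 32))  # out_dim x in_dim
inputs = (torch.rand(20, 8, 32) < 0.3).float()          # Poisson-like spikes
out = lif_unroll(inputs, weight)
out.mean().backward()  # BPTT: gradients flow across all 20 timesteps
```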
“…Furthermore, the conversion method is not suitable for neuromorphic data. Some gradient-based direct training methods find the equivalence between spike representations (e.g., firing rates or first spike times) of SNNs and some differentiable mappings or fixed-point equations [40,43,58,61,62,66,67,69,75]. Then the spike-representation-based methods train SNNs by gradients calculated from the corresponding mappings or fixed-point equations.…”
Section: Related Work (mentioning)
confidence: 99%
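
The spike-representation idea in this last statement can be illustrated with the textbook rate-equivalence argument: for an integrate-and-fire neuron with reset-by-subtraction and constant input, the firing rate over T steps converges to a clipped linear, differentiable mapping, whose gradient can stand in for backpropagation through the spiking dynamics. The toy setup below (constant input, unit threshold) is an assumed example, not any cited paper's exact construction.

```python
import torch


def if_firing_rate(x, T=100, threshold=1.0):
    """Simulate an integrate-and-fire neuron with constant input x per
    step and reset-by-subtraction; return the empirical firing rate."""
    mem = torch.zeros_like(x)
    count = torch.zeros_like(x)
    for _ in range(T):
        mem = mem + x
        spk = (mem >= threshold).float()
        count = count + spk
        mem = mem - spk * threshold
    return count / T


x = torch.linspace(-0.5, 1.5, 5)
print(if_firing_rate(x))          # empirical spike rates
print(torch.clamp(x, 0.0, 1.0))   # the equivalent differentiable mapping
# The two nearly coincide, so gradients of clamp(x, 0, 1) can replace
# gradients of the simulated spiking dynamics.
```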