2023
DOI: 10.3389/fnins.2023.1209795

Direct learning-based deep spiking neural networks: a review

Abstract: The spiking neural network (SNN), as a promising brain-inspired computational model with binary spike information transmission mechanism, rich spatially-temporal dynamics, and event-driven characteristics, has received extensive attention. However, its intricately discontinuous spike mechanism brings difficulty to the optimization of the deep SNN. Since the surrogate gradient method can greatly mitigate the optimization difficulty and shows great potential in directly training deep SNNs, a variety of direct le…
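The surrogate gradient method the abstract refers to replaces the non-existent derivative of the binary spike with a smooth stand-in during backpropagation, so a deep SNN can be trained directly with gradient descent. The sketch below illustrates the idea with a leaky integrate-and-fire neuron and a sigmoid-shaped surrogate; the neuron model, threshold, and surrogate shape are illustrative assumptions rather than the review's specific formulation.

```python
import torch


class SpikeFunction(torch.autograd.Function):
    """Heaviside spike in the forward pass, surrogate gradient in the backward pass."""

    @staticmethod
    def forward(ctx, membrane_potential, threshold):
        ctx.save_for_backward(membrane_potential)
        ctx.threshold = threshold
        # Binary spike: 1 where the membrane potential reaches the threshold, else 0.
        return (membrane_potential >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        (membrane_potential,) = ctx.saved_tensors
        # Surrogate derivative: derivative of a sigmoid centered at the threshold
        # (one common choice; triangular and rectangular windows are also used).
        alpha = 4.0  # surrogate sharpness (assumed value)
        sig = torch.sigmoid(alpha * (membrane_potential - ctx.threshold))
        surrogate = alpha * sig * (1.0 - sig)
        return grad_output * surrogate, None  # no gradient w.r.t. the threshold


def lif_step(input_current, v, tau=2.0, threshold=1.0):
    """One leaky integrate-and-fire update with a hard reset after each spike."""
    v = v + (input_current - v) / tau          # leaky integration of the input
    spike = SpikeFunction.apply(v, threshold)  # binary forward, surrogate backward
    v = v * (1.0 - spike)                      # reset where a spike was emitted
    return spike, v
```

The reset-by-multiplication and the surrogate sharpness `alpha` are common design choices; direct-training works differ mainly in the surrogate shape and in how the membrane dynamics are parameterized.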

Cited by 18 publications (3 citation statements)
References 104 publications
“…SNNs, inspired by biological neurons, excel in energy efficiency and event-driven computation, making them ideal for neuromorphic computing and real-time applications. SNNs offer promising applications due to their resemblance to biological brains and unique computational capabilities [157]. Particularly in neuromorphic engineering, SNNs closely mimic biological brain functions, making them ideal for tasks like sensory processing, pattern recognition, and motor control in robots and autonomous systems [158, 159].…”
Section: Discussion on PNNs and Concluding Remarks
confidence: 99%
“…Such methods (Wu et al., 2018; Neftci et al., 2019; Lee et al., 2020) usually obtain low latency and can train SNNs with small simulation time steps. Recently, in addition to designing powerful spiking neurons (Fang et al., 2021b; Li et al., 2022; Yao et al., 2022), researchers have primarily improved the accuracy of direct-training SNNs from network structure and training techniques (Guo et al., 2023). We will elaborate on them.…”
Section: Related Work
confidence: 99%
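The "small simulation time steps" mentioned in the statement above refer to unrolling the spiking network for only a handful of discrete steps and training on the time-averaged output, which is what keeps the latency of directly trained SNNs low. A minimal self-contained sketch of such an unrolled forward pass follows; the two-layer MLP, four time steps, and straight-through sigmoid surrogate are assumptions chosen only for illustration.

```python
import torch
import torch.nn as nn


def spike(v, threshold=1.0, alpha=4.0):
    """Heaviside spike whose gradient comes from a sigmoid surrogate (straight-through)."""
    soft = torch.sigmoid(alpha * (v - threshold))
    hard = (v >= threshold).float()
    return soft + (hard - soft).detach()


def forward_over_time(fc1, fc2, x, time_steps=4, tau=2.0):
    """Unroll a two-layer spiking MLP for a few discrete steps and average the readout."""
    v1 = torch.zeros(x.shape[0], fc1.out_features)       # hidden membrane potentials
    out_sum = torch.zeros(x.shape[0], fc2.out_features)  # accumulated readout
    for _ in range(time_steps):
        v1 = v1 + (fc1(x) - v1) / tau   # leaky integration in the hidden layer
        s1 = spike(v1)                  # binary spikes, surrogate gradient
        v1 = v1 * (1.0 - s1)            # hard reset after firing
        out_sum = out_sum + fc2(s1)     # accumulate the readout over time
    return out_sum / time_steps         # rate-coded logits


fc1, fc2 = nn.Linear(784, 256), nn.Linear(256, 10)
logits = forward_over_time(fc1, fc2, torch.rand(32, 784))
loss = nn.functional.cross_entropy(logits, torch.randint(0, 10, (32,)))
loss.backward()  # gradients flow through the surrogate spike function
```

Because every step shares the same weights, backpropagation through the unrolled loop updates them with gradients accumulated across the few simulated time steps.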
“…In order to facilitate effective training of SNNs, researchers have proposed various methods [7, 8, 9]. Present research predominantly concentrates on three major aspects: pretraining through clustering and autoencoding, among other methods, under unsupervised learning; enhancing performance by combining supervised information and unlabeled data under semisupervised learning; and implementing backpropagation under supervised learning using alternative differentiable activation functions or other techniques.…”
Section: Introduction
confidence: 99%