2023 · DOI: 10.3389/fnins.2023.1141701
High-accuracy deep ANN-to-SNN conversion using quantization-aware training framework and calcium-gated bipolar leaky integrate and fire neuron

Abstract: Spiking neural networks (SNNs) have attracted intensive attention due to their efficient event-driven computing paradigm. Among SNN training methods, ANN-to-SNN conversion is usually regarded as achieving state-of-the-art recognition accuracy. However, many existing ANN-to-SNN techniques impose lengthy post-conversion steps, such as threshold balancing and weight renormalization, to compensate for the inherent behavioral discrepancy between artificial and spiking neurons. In addition, they require a long tempora…
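For readers unfamiliar with the behavioral discrepancy the abstract refers to, the sketch below shows a plain leaky integrate-and-fire neuron: unlike a ReLU, its output is a binary spike train whose time-averaged rate only approximates an analog activation. This is a generic LIF for illustration, not the paper's calcium-gated bipolar variant, and all parameter values are assumptions.

```python
# Minimal sketch of a leaky integrate-and-fire (LIF) neuron, the spiking
# counterpart that conversion methods must match against the ANN's ReLU.
# Generic LIF only -- not the paper's calcium-gated bipolar neuron.
import numpy as np

def lif_forward(inputs, tau=2.0, v_th=1.0, v_reset=0.0):
    """Simulate one LIF neuron over T time steps of input current."""
    v = v_reset
    spikes = []
    for x in inputs:                  # inputs: length-T array of currents
        v = v + (x - v) / tau         # leaky integration toward the input
        if v >= v_th:                 # threshold crossing emits a spike
            spikes.append(1)
            v = v_reset               # hard reset after firing
        else:
            spikes.append(0)
    return np.array(spikes)

# A constant drive of 1.5 makes the neuron fire every other step:
rate = lif_forward(np.full(100, 1.5)).mean()
print(f"firing rate: {rate:.2f}")    # ~0.50
```

The neuron's response is discretized and time-averaged rather than instantaneous and real-valued; it is exactly this gap that post-conversion steps like threshold balancing try to close.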

Cited by 12 publications (7 citation statements) · References 32 publications (78 reference statements)

Citation statements:
“…59
Method                    Architecture                    Time steps   Accuracy (%)
Fang et al. (2021) [62]   128c3-p2-128c3-p2-2048-100-10   8            99.60
FELL (2023) [63]          128c3-p2-128c3-p2-2048-100-10   10           99.38
BELL (2023) [63]          128c3-p2-128c3-p2-2048-100-10   10           99.35
ELL (2023) [63]           128c3-p2-128c3-p2-2048-100-10   10           99.46
Gao et al. (2023) [64]    VGG-9                           128          99.62
Ours                      128c3-p2-128c3-p2-2048-100-10   8            99.67…”
Section: Work Comparison and Discussion (mentioning, confidence: 95%)
“…Ma et al.’s three methods [63], FELL, BELL, and ELL, all emphasized spike learning based on local classifiers, indicating the effectiveness of local learning in deep networks. Gao et al. [64] adopted the VGG-9 structure and employed a quantized training framework for the conversion from deep ANNs to SNNs. While this approach is technically noteworthy, it required significantly more time steps than our method.…”
Section: Methods (mentioning, confidence: 99%)
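The "quantized training framework" this statement refers to generally means quantization-aware training: the ANN's activations are clipped and snapped to L discrete levels during training so that they align with the L spike counts an SNN can emit in a fixed number of time steps. A minimal PyTorch sketch of that standard device follows; the class name, level count, and straight-through-estimator backward are assumptions, not necessarily Gao et al.'s exact formulation.

```python
# Hedged sketch of a quantization-aware activation: a clipped ReLU whose
# output is rounded to `levels` discrete values, trained with a
# straight-through estimator (STE). Illustrative, not the paper's exact QAT.
import torch

class QuantReLU(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, levels: int, v_max: float):
        ctx.save_for_backward(x)
        ctx.v_max = v_max
        step = v_max / levels
        y = torch.clamp(x, 0.0, v_max)        # clip like a bounded ReLU
        return torch.round(y / step) * step   # snap to one of `levels` values

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        # STE: pass gradients through unchanged inside the clip range.
        mask = (x >= 0) & (x <= ctx.v_max)
        return grad_out * mask.float(), None, None

x = torch.randn(4, requires_grad=True)
y = QuantReLU.apply(x, 8, 1.0)   # 8 levels ~ what 8 SNN time steps can express
y.sum().backward()
```

Training against this discretized activation is what lets a converted SNN reach ANN-level accuracy with few time steps, since the ANN has already learned to tolerate the quantization error.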
“…Note that the CNN layers used for feature-reduction purposes can be converted into SNN layers with various methods, as shown in many recent studies [7, 38–41], or trained using backpropagation through time [42, 43], which opens up the potential for adopting the entire feature-reduction stage in a low-power neuromorphic setting. In this work, we focus on the CNN-SNN conversion method and train it by backpropagation, without unrolling through time, considering a rate-coded network.…”
Section: Deep Clustering With K-means Algorithm (mentioning, confidence: 99%)
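A rate-coded network, as assumed in this statement, represents each real-valued activation by the mean of a stochastic binary spike train. A minimal sketch of such an encoder, assuming a simple Bernoulli sampling scheme (the function name and sampling choice are illustrative):

```python
# Sketch of rate coding: an activation a in [0, 1] is represented by a
# binary spike train whose mean over T steps approximates a.
import numpy as np

rng = np.random.default_rng(0)

def rate_encode(a, T=100):
    """Encode activations a (values in [0, 1]) as T binary spike frames."""
    a = np.clip(a, 0.0, 1.0)
    return (rng.random((T,) + a.shape) < a).astype(np.float32)

pixels = np.array([0.1, 0.5, 0.9])
spikes = rate_encode(pixels)       # shape (100, 3)
print(spikes.mean(axis=0))         # ~[0.1, 0.5, 0.9]
```

Because only the rate matters under this view, the network can be trained by ordinary backpropagation on the rates, with no unrolling through time, exactly as the quoted passage describes.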
“…Therefore, many studies combine various mathematical optimization techniques with spiking neural networks, trying to propose a new learning paradigm suitable for SNNs. For example, Xing et al. adopted the ANN-to-SNN strategy to migrate the parameters of trained ANNs to SNNs of the same architecture (Xing et al., 2019; Gao et al., 2023). Anwar et al. applied reinforcement learning to spiking neural networks to perform specific tasks, such as Pong and Cartpole game playing (Bellec et al., 2020; Anwar et al., 2022; Haşegan et al., 2022).…”
Section: Related Work (mentioning, confidence: 99%)
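The parameter-migration strategy described here copies trained ANN weights into an SNN of the same architecture and then balances thresholds against observed activations — the classic post-conversion step that the abstract above says this paper aims to eliminate. A hedged sketch under the assumption of a simple Linear/ReLU stack; `convert` and its max-activation threshold estimate are illustrative, not any specific library's API.

```python
# Sketch of ANN-to-SNN weight migration with data-based threshold balancing.
# Assumes `ann_layers` is an alternating torch.nn.Linear / torch.nn.ReLU stack;
# thresholds are set from activations observed on a calibration batch.
import torch

@torch.no_grad()
def convert(ann_layers, calib_batch):
    """Return per-layer (weight, threshold) pairs via max-activation balancing."""
    converted, x = [], calib_batch
    for layer in ann_layers:
        x = layer(x)                       # run the calibration data forward
        if isinstance(layer, torch.nn.Linear):
            v_th = x.max().item()          # threshold = largest observed activation
            converted.append((layer.weight.clone(), v_th))
    return converted
```

Setting each spiking layer's threshold to the largest activation the ANN produces keeps firing rates in the neuron's linear operating range, which is what makes the migrated weights behave comparably in the SNN.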