2023
DOI: 10.3390/biomimetics8040375
IDSNN: Towards High-Performance and Low-Latency SNN Training via Initialization and Distillation

Xiongfei Fan,
Hong Zhang,
Yu Zhang

Abstract: Spiking neural networks (SNNs) are widely recognized for their biomimetic and efficient computing features. They utilize spikes to encode and transmit information. Despite the many advantages of SNNs, they suffer from low accuracy and large inference latency, caused respectively by direct training and by conversion from artificial neural network (ANN) training methods. To address these limitations, we propose a novel training pipeline (called IDSNN) based on parameter initialization…
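The distillation half of the pipeline described in the abstract can be sketched as a standard knowledge-distillation loss: a hard-label cross-entropy term blended with a KL term against the softened ANN teacher outputs. This is a minimal NumPy illustration of that general technique, not the authors' implementation; the temperature `T`, blend weight `alpha`, and function names are assumptions.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; larger T softens the distribution.
    z = np.asarray(z, dtype=float) / T
    e = np.exp(z - z.max())
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, label, T=4.0, alpha=0.5):
    """Blend of hard-label cross-entropy and a softened-teacher KL term,
    as commonly used when distilling an ANN teacher into an SNN student.
    (Illustrative sketch; not the loss defined in the IDSNN paper.)"""
    p_s = softmax(student_logits)               # student probs (T=1) for CE
    ce = -np.log(p_s[label])                    # cross-entropy with true label
    p_t = softmax(teacher_logits, T)            # softened teacher targets
    q_s = softmax(student_logits, T)            # softened student predictions
    kl = np.sum(p_t * (np.log(p_t) - np.log(q_s)))  # KL(teacher || student)
    # T**2 rescales the soft term's gradient magnitude (standard practice).
    return alpha * ce + (1.0 - alpha) * (T ** 2) * kl
```

When student and teacher logits agree, the KL term vanishes and only the hard-label loss remains.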

Cited by 3 publications (4 citation statements) · References 30 publications
“…Wang et al (Wang et al, 2022d) proposed a time-domain belief distribution strategy for back-propagation of spiking neural networks, and used GPU for accelerated operations. Fan et al (Fan et al, 2023) designed a deep spiking neural network for classification tasks, and performed supervised learning and accelerated operations on the deep learning open-source platform. In addition, some researchers have also tried using spiking neural networks in complex visual tasks.…”
Section: Spiking Neural Network (mentioning, confidence: 99%)
“…In addition, some researchers have also tried using spiking neural networks in complex visual tasks. Wang and Fan (Wang et al, 2022e; Fan et al, 2023) designed a spiking neural network with multi-layer neuron combination based on empirical information, which was applied to the stereo vision system of binocular parallax and monocular zoom, respectively. Joseph et al (Joseph and Pakrashi, 2022) proposed a multi-level cascaded spiking neural network to detect candidate regions of fixed scene objects.…”
Section: Spiking Neural Network (mentioning, confidence: 99%)
“…Fan et al (2023) [8] introduce a novel approach called IDSNN (initialization and distillation for SNNs) to address the low accuracy and high inference latency faced by SNNs. IDSNN leverages parameter initialization and knowledge distillation from ANNs, resulting in competitive top-1 accuracy for CIFAR10 (94.22%) and CIFAR100 (75.41%) with minimal latency.…”
(mentioning, confidence: 99%)
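The other half of IDSNN named in the statement above, parameter initialization from an ANN, rests on the observation that an integrate-and-fire neuron driven by constant input approximates a ReLU via its firing rate, so trained ANN weights can seed an SNN. The sketch below illustrates that rate-coding idea only; the function name, soft-reset choice, and threshold are assumptions, not the paper's method.

```python
import numpy as np

def if_layer_rate(x, W, T=64, v_th=1.0):
    """Integrate-and-fire layer driven for T steps with constant input.
    With W copied from a trained ReLU ANN layer (the initialization idea),
    the firing rate approximates ReLU(W @ x) as T grows.
    (Illustrative sketch, not the IDSNN initialization procedure.)"""
    i_in = W @ x                      # constant input current per neuron
    v = np.zeros(W.shape[0])          # membrane potentials
    spikes = np.zeros(W.shape[0])     # spike counts per neuron
    for _ in range(T):
        v += i_in                     # integrate the input current
        fired = v >= v_th             # neurons crossing threshold emit a spike
        spikes += fired
        v[fired] -= v_th              # soft reset: subtract the threshold
    return spikes / T                 # firing rate ≈ clipped ReLU activation
```

For example, with a weight row of +1 and input 0.5, the neuron fires every second step (rate 0.5), while a negatively driven neuron never fires, mirroring ReLU's zero branch.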