2022
DOI: 10.1007/978-3-031-20071-7_37

Neuromorphic Data Augmentation for Training Spiking Neural Networks

Abstract: Spiking Neural Networks (SNNs) have recently become more popular as a biologically plausible substitute for traditional Artificial Neural Networks (ANNs). SNNs are cost-efficient and deployment-friendly because they process input in both spatial and temporal manners using binary spikes. However, we observe that the information capacity in SNNs is affected by the number of timesteps, leading to an accuracy-efficiency tradeoff. In this work, we study a fine-grained adjustment of the number of timesteps in SNNs. …
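The tradeoff described in the abstract arises because an SNN repeats its forward pass once per timestep, integrating input into a membrane potential and emitting binary spikes. Below is a minimal sketch of that loop, assuming a standard leaky integrate-and-fire (LIF) neuron; the `tau` and `v_threshold` names and values are illustrative assumptions, not taken from the paper.

```python
import torch

def lif_forward(x_seq, tau=2.0, v_threshold=1.0):
    """Run a leaky integrate-and-fire neuron over T timesteps.

    x_seq: input currents of shape (T, batch, features).
    Returns binary spike outputs of the same shape.
    Illustrative sketch only; not the paper's exact neuron model.
    """
    v = torch.zeros_like(x_seq[0])      # membrane potential
    spikes = []
    for x_t in x_seq:                   # one forward pass per timestep
        v = v + (x_t - v) / tau         # leaky integration of input current
        s = (v >= v_threshold).float()  # binary spike when threshold is crossed
        v = v * (1.0 - s)               # hard reset after a spike
        spikes.append(s)
    # More timesteps carry more information but cost proportionally more compute.
    return torch.stack(spikes)
```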

Cited by 42 publications (15 citation statements) | References 65 publications
“…Li et al. [55] propose several randomized geometric augmentations for training SNNs. These include common techniques such as horizontal flip, translation, and rotation, as well as less common techniques such as cutout, shear, and CutMix.…”
Section: Augmentation Methods for Event-Based Vision (mentioning)
confidence: 99%
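A minimal sketch of this family of randomized geometric augmentations, applied to a tensor of event frames. This is not the authors' released implementation; the parameter ranges are illustrative assumptions, and the same transform is applied to every timestep so spatial structure stays consistent across time.

```python
import random
import torch
import torchvision.transforms.functional as TF

def augment_event_frames(frames):
    """Apply one random geometric augmentation to a (T, C, H, W) event-frame tensor.

    Sketch of flip / translation / rotation / shear / cutout; ranges are
    illustrative, not the paper's published settings.
    """
    T, C, H, W = frames.shape
    choice = random.choice(["flip", "translate", "rotate", "shear", "cutout"])
    if choice == "flip":
        return TF.hflip(frames)
    if choice == "translate":
        dx = random.randint(-W // 8, W // 8)
        dy = random.randint(-H // 8, H // 8)
        return TF.affine(frames, angle=0.0, translate=[dx, dy], scale=1.0, shear=[0.0])
    if choice == "rotate":
        return TF.rotate(frames, angle=random.uniform(-15.0, 15.0))
    if choice == "shear":
        return TF.affine(frames, angle=0.0, translate=[0, 0], scale=1.0,
                         shear=[random.uniform(-10.0, 10.0)])
    # cutout: zero out a random square patch across all timesteps
    size = H // 4
    y, x = random.randint(0, H - size), random.randint(0, W - size)
    frames = frames.clone()
    frames[..., y:y + size, x:x + size] = 0.0
    return frames
```

CutMix, which pastes a patch from one sample into another and mixes the labels proportionally, requires a pair of samples and is omitted from this single-sample sketch for brevity.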
“…We use VGG16 (Simonyan and Zisserman, 2015) and ResNet19 (He et al., 2016). For both architectures, we use the scaled-up channel size following previous SNN works (Zheng et al., 2020; Li et al., 2022b). We train the SNNs with a batch size of 128 using the SGD optimizer with momentum 0.9 and weight decay 5e-4.…”
Section: Methods (mentioning)
confidence: 99%
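The quoted setup maps directly onto a standard PyTorch optimizer configuration; a minimal sketch follows. The network is a placeholder (the cited works use scaled-up VGG16 / ResNet19 variants), and the learning rate is an assumption, since it is not quoted above.

```python
import torch
import torch.nn as nn

# Placeholder network standing in for the scaled-up VGG16 / ResNet19 variants.
model = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32 * 3, 10))

optimizer = torch.optim.SGD(
    model.parameters(),
    lr=0.1,            # assumption: the learning rate is not stated in the quote
    momentum=0.9,      # quoted
    weight_decay=5e-4, # quoted
)

# The quoted batch size of 128 would be set on the DataLoader, e.g.:
# loader = torch.utils.data.DataLoader(dataset, batch_size=128, shuffle=True)
```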
“…Many works conduct multi-stage training, typically including an ANN pre-training process, to reduce latency (i.e., the number of time steps) for energy efficiency while maintaining competitive performance [8,9,50,51]. The BPTT-with-SG method has achieved high performance with low latency on both static [21,24] and neuromorphic [16,37] datasets. However, these approaches must backpropagate error signals through both the temporal and spatial domains, and thus suffer from high computational costs during training [14].…”
Section: Related Work (mentioning)
confidence: 99%
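BPTT with a surrogate gradient (SG) works by keeping the non-differentiable spike threshold on the forward pass while substituting a smooth derivative on the backward pass, so error signals can flow through every timestep. A minimal sketch follows; the sigmoid-derivative surrogate and its sharpness `alpha` are illustrative choices, not a specific method from the cited works.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike forward; smooth surrogate derivative backward.

    Sketch of the BPTT-with-SG idea; the surrogate shape is an
    illustrative assumption.
    """
    alpha = 4.0  # sharpness of the surrogate around the threshold

    @staticmethod
    def forward(ctx, v_minus_threshold):
        ctx.save_for_backward(v_minus_threshold)
        return (v_minus_threshold >= 0).float()  # non-differentiable step

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        sig = torch.sigmoid(SurrogateSpike.alpha * v)
        # Derivative of a scaled sigmoid stands in for the step's gradient.
        return grad_output * SurrogateSpike.alpha * sig * (1.0 - sig)

spike = SurrogateSpike.apply

# During BPTT, gradients pass through `spike` at every timestep (temporal
# domain) and through the layer stack (spatial domain), which is why the
# quoted passage notes the high training cost of this approach.
```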