ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp43922.2022.9747411

Axonal Delay as a Short-Term Memory for Feed Forward Deep Spiking Neural Networks

Abstract: In spiking neural networks (SNNs), information is propagated between adjacent neurons by spikes, which provides a computing paradigm with the promise of simulating the human brain. Recent studies have found that the time delay of neurons plays an important role in the learning process. Therefore, configuring the precise timing of spikes is a promising direction for understanding and improving the transmission of temporal information in SNNs. However, most of the existing learning m…

Cited by 12 publications (7 citation statements)
References 23 publications
“…We will follow the shorthand notation in [15] to represent the architecture: layers are separated by -, spatial dimensions are separated by x, an N × N convolution filter with K channels is represented by KcN, an N × N aggregate pooling filter is represented by Na, and a dense layer with N neurons is represented by the number itself. Note that the convolution and dense layer neurons also have trainable axonal delays [37]. In all our experiments, we use a neuron threshold of ϑ = 10 mV and a sampling time of 1 ms.…”
Section: Experiments and Results (mentioning)
confidence: 99%
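The shorthand described in the quoted passage is mechanical enough to parse programmatically. The Python sketch below illustrates one possible reading of it; the function name `parse_architecture` and the returned layer tuples are assumptions made for this example, not code from the cited papers.

```python
import re

def parse_architecture(spec: str):
    """Parse a shorthand architecture string such as '34x34x2-16c5-2a-10'.

    Tokens, as described in the quoted passage (this parser is illustrative):
      AxBxC -> input spatial dimensions
      KcN   -> N x N convolution filter with K channels
      Na    -> N x N aggregate pooling filter
      N     -> dense layer with N neurons
    Returns a list of (layer_type, params) tuples.
    """
    layers = []
    for token in spec.split("-"):
        if "x" in token:
            layers.append(("input", {"shape": tuple(int(d) for d in token.split("x"))}))
            continue
        m = re.fullmatch(r"(\d+)c(\d+)", token)
        if m:
            layers.append(("conv", {"channels": int(m.group(1)),
                                    "kernel": int(m.group(2))}))
            continue
        m = re.fullmatch(r"(\d+)a", token)
        if m:
            layers.append(("pool", {"kernel": int(m.group(1))}))
            continue
        layers.append(("dense", {"neurons": int(token)}))
    return layers

# Example: 34x34x2 input, a 16-channel 5x5 convolution, 2x2 pooling,
# and a 10-neuron dense output layer.
print(parse_architecture("34x34x2-16c5-2a-10"))
```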
“…Delays in SNNs: Many methods have been proposed for adapting propagation delays, inspired by spike-timing-dependent plasticity (Wang et al, 2013) or based on the ReSuMe learning rule (Zhang et al, 2020). A method for training per-neuron axonal delays based on the SLAYER learning paradigm was proposed (Shrestha and Orchard, 2018) and extended (Sun et al, 2023b) with trainable delay caps. Recently, the effects of axonal synaptic delay learning were studied by pruning multiple delay synapses (Patiño-Saucedo et al, 2023), modeling a one-layer multinomial logistic regression with synaptic delays (Grimaldi and Perrinet, 2023), and learning delays represented through 1D convolutions with learnable spacings (Hammouamri et al, 2023).…”
Section: Figure (mentioning)
confidence: 99%
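The common thread among the approaches listed above is that each neuron's output spike train is shifted along the time axis by a (possibly learnable) delay. Purely as an illustration of that idea, and not as the implementation of any cited method, a forward pass with fixed integer per-neuron delays can be sketched as below; the function name `apply_axonal_delays` and the array layout are assumptions of this example.

```python
import numpy as np

def apply_axonal_delays(spikes: np.ndarray, delays: np.ndarray) -> np.ndarray:
    """Shift each neuron's spike train forward in time by its own delay.

    spikes: (num_neurons, num_timesteps) binary spike trains
    delays: (num_neurons,) non-negative integer delays, in timesteps
    """
    num_neurons, num_steps = spikes.shape
    delayed = np.zeros_like(spikes)
    for n in range(num_neurons):
        d = int(delays[n])
        if d < num_steps:
            # Spikes that would fall past the end of the window are dropped.
            delayed[n, d:] = spikes[n, : num_steps - d]
    return delayed

# Two neurons over 8 timesteps: neuron 0 delayed by 2 steps, neuron 1 unchanged.
spikes = np.array([[1, 0, 0, 1, 0, 0, 0, 0],
                   [0, 1, 0, 0, 0, 1, 0, 0]])
print(apply_axonal_delays(spikes, np.array([2, 0])))
```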
“…Shrestha and Orchard (2018) propose a general backpropagation mechanism for learning synaptic weights and axonal delays which overcomes the non-differentiability of the spike function and uses a temporal credit assignment policy for backpropagating error to preceding layers. Sun et al (2022) propose the rectified axonal delay (RAD) as an additional degree of freedom for training that can easily be incorporated into existing SNN frameworks. The new model can perform well on problems where timing matters using very few parameters.…”
Section: Introduction (mentioning)
confidence: 99%
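Building on the integer-shift sketch above, one hedged reading of the rectified axonal delay is a per-neuron delay parameter passed through a ReLU so the effective delay can never become negative, with fractional delays realized by interpolating between two integer shifts to keep the operation differentiable. The PyTorch module below is only a sketch under that reading; the class name, the interpolation scheme, and the tensor layout are assumptions, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RectifiedAxonalDelay(nn.Module):
    """Shift each neuron's output along the time axis by a learnable, rectified delay."""

    def __init__(self, num_neurons: int):
        super().__init__()
        # One trainable (unconstrained) delay parameter per neuron.
        self.raw_delay = nn.Parameter(torch.zeros(num_neurons))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_neurons, time) spike tensor
        T = x.shape[-1]
        delay = F.relu(self.raw_delay)   # rectify: delays cannot be negative
        d0 = torch.floor(delay)          # integer part of each delay
        frac = delay - d0                # fractional part (carries the gradient)
        out = []
        for n in range(x.shape[1]):
            lo = int(d0[n].item())
            # Shift neuron n's spike train right by lo and lo+1 timesteps,
            # then blend the two shifts according to the fractional delay.
            shifted_lo = F.pad(x[:, n, :], (lo, 0))[:, :T]
            shifted_hi = F.pad(x[:, n, :], (lo + 1, 0))[:, :T]
            out.append((1 - frac[n]) * shifted_lo + frac[n] * shifted_hi)
        return torch.stack(out, dim=1)

# Usage sketch: delay the outputs of a 100-neuron spiking layer over 300 timesteps.
layer = RectifiedAxonalDelay(num_neurons=100)
delayed = layer(torch.rand(8, 100, 300).bernoulli())
print(delayed.shape)  # torch.Size([8, 100, 300])
```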