2022
DOI: 10.1109/tnnls.2021.3071976
Temporal Coding in Spiking Neural Networks With Alpha Synaptic Function: Learning With Backpropagation

Cited by 45 publications (27 citation statements)
References 79 publications
“…Despite the efforts to improve the efficiency of TTFS coding in deep SNNs [17,36], their improvements have been restricted by the conversion-based training algorithms. In several studies, SNNs were directly trained, but their methods were not validated as applicable to deep SNNs [12,37,38]. A recent study suggested direct training methods…”
Section: Training Methods of Deep SNNs
confidence: 99%
“…We use a neuronal model previously described by Comşa et al. (2021). Upon spiking, an input neuron, indexed by i, produces an increase over time t in the temporal membrane of a downstream (output) neuron described by an α function (Sterratt et al., 2018) of the form w_i (t − t_i) e^{−τ(t − t_i)}, where:…”
Section: Spiking Neuron Model
confidence: 99%
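The α synaptic function quoted above can be sketched directly. The snippet below is a minimal illustration, not the authors' implementation: `alpha_psp`, the parameter names, and the choice of τ as a decay-rate constant are assumptions for the example; only the functional form w_i (t − t_i) e^{−τ(t − t_i)} comes from the quoted text.

```python
import math

def alpha_psp(t, t_i, w_i, tau):
    """Postsynaptic potential contributed at time t by an input spike at t_i,
    using the alpha synaptic function w_i * (t - t_i) * exp(-tau * (t - t_i)).
    A spike contributes nothing before it arrives (t < t_i)."""
    if t < t_i:
        return 0.0
    dt = t - t_i
    return w_i * dt * math.exp(-tau * dt)
```

One useful property of this kernel: setting the derivative to zero shows it peaks at dt = 1/τ after the input spike, with peak value w_i / (τ·e), so each presynaptic spike produces a smooth rise-then-decay rather than an instantaneous jump.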
“…Given a set of input spikes t_{i∈{1..n}} and their corresponding weights w_{i∈{1..n}}, the output spike time is given by (refer to Comşa et al., 2021 for the full derivation):…”
Section: Spiking Neuron Model
confidence: 99%
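The quote elides the closed-form expression for the output spike time (in Comşa et al., 2021 it involves the Lambert W function), so the sketch below does not attempt to reproduce it. Instead it shows the underlying mechanics under stated assumptions: the membrane potential is modeled as the sum of α-function PSPs over all input spikes, and the output spike time is found by a simple numerical scan for the first threshold crossing. The threshold `theta`, the scan step, and the function names are illustrative, not from the paper.

```python
import math

def membrane_potential(t, spikes, weights, tau):
    """Membrane potential at time t: sum of alpha-function PSPs
    w_i * (t - t_i) * exp(-tau * (t - t_i)) over all input spikes
    that have already arrived."""
    v = 0.0
    for t_i, w_i in zip(spikes, weights):
        if t >= t_i:
            dt = t - t_i
            v += w_i * dt * math.exp(-tau * dt)
    return v

def first_threshold_crossing(spikes, weights, tau, theta,
                             t_max=10.0, step=1e-4):
    """Numerically scan for the earliest time the potential reaches
    the firing threshold theta; returns None if it never does.
    Stands in for the closed-form spike time derived in the paper."""
    t = min(spikes)
    while t <= t_max:
        if membrane_potential(t, spikes, weights, tau) >= theta:
            return t
        t += step
    return None
```

For example, a single input spike at t = 0 with weight 2 and τ = 1 gives V(t) = 2t·e^{−t}, which peaks at 2/e ≈ 0.736; with θ = 0.5 the neuron fires shortly before the peak, at roughly t ≈ 0.357.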