2018
DOI: 10.1016/j.procs.2018.11.107

Spiking neural network reinforcement learning method based on temporal coding and STDP

Cited by 7 publications (4 citation statements)
References 11 publications
“…accuracy, SNNs are not inferior to ANNs and are often even better. In [26], SNNs have shown results comparable to ANNs and ML methods for classification of the Iris data set. The same has been shown for the MNIST data set in comparison with a CNN [16], whereas for the SNN the letters were recorded with a retinomorphic data set [22].…”
Section: Advantages and Drawbacks
Mentioning (confidence: 88%)
“…Rate coding uses the firing rate of spike trains in a time window to encode information, where input real numbers are converted into spike trains with a frequency proportional to the input value (Cheng et al. 2020a). Temporal coding encodes information in the relative timing of individual spikes, where input values are usually converted into spikes at precise times (Comsa et al. 2020; Sboev et al. 2018). Besides that, population coding is distinctive in that it integrates these two types.…”
Section: Information Coding Methods in SNNs
Mentioning (confidence: 99%)
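The quoted passage distinguishes rate coding from temporal coding. A minimal sketch of both encodings follows, assuming NumPy and illustrative parameters (time window, maximum firing probability) that are not taken from the cited papers:

```python
import numpy as np

def rate_encode(x, t_window=100, max_prob=0.5, rng=None):
    """Rate coding: a Bernoulli spike train whose per-step firing
    probability is proportional to the input value x in [0, 1]."""
    rng = rng if rng is not None else np.random.default_rng(0)
    p = np.clip(x, 0.0, 1.0) * max_prob          # spike probability per time step
    return (rng.random(t_window) < p).astype(np.uint8)

def latency_encode(x, t_window=100):
    """Temporal (latency) coding: a single spike whose time is earlier
    for larger inputs, so the precise spike time carries the value."""
    spikes = np.zeros(t_window, dtype=np.uint8)
    t = int(round((1.0 - float(np.clip(x, 0.0, 1.0))) * (t_window - 1)))
    spikes[t] = 1
    return spikes

x = 0.8
print(rate_encode(x).sum(), "spikes in the rate-coded train")
print(int(np.argmax(latency_encode(x))), "= spike time in the latency-coded train")
```

Rate coding spreads the value over many stochastic spikes within the window, while latency coding carries it in a single precisely timed spike, which is the distinction the passage draws.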
“…Unlike LIF neurons, the DNs showed higher complexity owing to an additional implicit variable U, which alters how their equilibrium points change dynamically. Small differences in U cause a large update of V according to the definition of the DNs, especially when the parameter θ_b in Equation (19) is small. Hence, the DNs not only show firing patterns similar to those of LIF neurons under a strong positive stimulus but also exhibit sparse firing under weak-positive and negative stimuli, instead of ceasing to fire as LIF neurons do.…”
Section: Learn and Analyze DNs
Mentioning (confidence: 99%)
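For context on the comparison above, a minimal sketch of the standard leaky integrate-and-fire (LIF) update is given below; the DN model, its implicit variable U, and the parameter θ_b from Equation (19) of the citing paper are not reproduced here, and all parameter values are illustrative assumptions:

```python
import numpy as np

def lif_simulate(input_current, tau=20.0, v_rest=0.0, v_thresh=1.0,
                 v_reset=0.0, dt=1.0):
    """Standard leaky integrate-and-fire dynamics:
    dV/dt = (-(V - v_rest) + I) / tau, with a spike and hard reset
    when V crosses v_thresh. Parameters are illustrative only."""
    v = v_rest
    spikes = []
    for i_t in input_current:
        v += dt * (-(v - v_rest) + i_t) / tau    # leaky integration step
        if v >= v_thresh:
            spikes.append(1)
            v = v_reset                          # reset after the spike
        else:
            spikes.append(0)
    return np.array(spikes)

# A suprathreshold constant stimulus drives regular firing, while a
# subthreshold one produces no spikes at all: the LIF behaviour the
# quoted passage contrasts with the sparse firing of the DNs.
print(lif_simulate(np.full(200, 1.5)).sum(), "spikes for a strong stimulus")
print(lif_simulate(np.full(200, 0.5)).sum(), "spikes for a weak stimulus")
```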
“…There are two main categories of information encoding at the input coding stage of an SNN: rate coding and temporal coding. Rate coding uses the firing rate of spike trains in a time window to encode information, where input real numbers are converted into spike trains with a frequency proportional to the input value [14,15], and temporal coding encodes information in the relative timing of individual spikes, where input values are usually converted into spikes at precise times [16,17,18,19]. Besides that, population coding integrates these two types.…”
Section: Introduction
Mentioning (confidence: 99%)
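The passage notes that population coding integrates rate and temporal coding. One common realization uses Gaussian receptive fields, where each neuron's response strength is turned into a spike time; the sketch below assumes that scheme with illustrative parameters, not necessarily those of the citing paper:

```python
import numpy as np

def population_encode(x, n_neurons=10, t_max=100):
    """Population coding with Gaussian receptive fields: each neuron has a
    preferred value, and the closer x is to that value the earlier the
    neuron fires, so one value is spread over several precisely timed spikes."""
    centers = np.linspace(0.0, 1.0, n_neurons)       # preferred input values
    sigma = 1.0 / (n_neurons - 1)                    # receptive-field width
    activation = np.exp(-0.5 * ((x - centers) / sigma) ** 2)   # in (0, 1]
    spike_times = np.rint((1.0 - activation) * t_max).astype(int)
    return spike_times                               # one spike time per neuron

print(population_encode(0.37))                       # earliest times near x
```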