2021
DOI: 10.1007/s00521-021-05832-y
Quantized STDP-based online-learning spiking neural network

Cited by 25 publications (10 citation statements) · References 40 publications
“…η_post/η_pre represent the postsynaptic and presynaptic learning rates, respectively. To mimic the quantized weight updates typically observed in such device technologies (49), the weights in the network-level simulations were quantized using $w_q = \mathrm{round}\left[(2^N - 1) \times w\right] / (2^N - 1)$…”
Section: Methods
confidence: 99%
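As a minimal sketch of the quantization rule quoted above (assuming weights normalized to [0, 1] and NumPy as the numerical backend; the function name is ours, not from the cited work):

```python
import numpy as np

def quantize_weights(w, n_bits):
    """Snap weights in [0, 1] onto a uniform grid of 2**n_bits levels:
    w_q = round[(2**n_bits - 1) * w] / (2**n_bits - 1)."""
    levels = 2 ** n_bits - 1  # number of quantization steps between 0 and 1
    return np.round(levels * np.asarray(w, dtype=float)) / levels
```

With `n_bits = 4`, every weight snaps to one of 16 evenly spaced values between 0 and 1, and the rounding error per weight is at most half a step, i.e. 0.5 / 15.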
“…The most common model for training SNNs is the unsupervised spike-timing-dependent plasticity (STDP) approach [26]. STDP is used together with lateral inhibition and an adaptive spiking threshold to learn representations of input spike patterns that are suitable for classification [27]. The spikes are produced by converting the input waveform into a sequence of spikes ("spike trains") in a process called "encoding".…”
Section: ISSN: 2088-8708
confidence: 99%
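The "encoding" step described above can be illustrated with a simple Poisson rate code, a common choice for SNN inputs (the function name, the `max_rate` parameter, and the use of NumPy are our assumptions, not details from the cited work):

```python
import numpy as np

def poisson_encode(x, n_steps, max_rate=0.5, seed=0):
    """Convert an input vector x (values in [0, 1]) into a Boolean spike
    train of shape (n_steps, len(x)): at each time step, input neuron i
    fires independently with probability x[i] * max_rate."""
    rng = np.random.default_rng(seed)
    probs = np.clip(np.asarray(x, dtype=float), 0.0, 1.0) * max_rate
    return rng.random((n_steps, probs.size)) < probs
```

Averaged over many time steps, each neuron's firing rate is proportional to its input intensity, which is the property STDP-based classifiers rely on.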
“…In reality, from a system-design perspective, we need interleaved synaptic-device state-update phases that do not interfere with the neuron oscillation behavior (for instance, through decoupled write-read phases of three-terminal synaptic devices; Sengupta et al., 2016). Convergence was also unaffected by reduced programming resolution of the synaptic connections (4 bits), indicating resilience to quantization (Hu et al., 2021).…”
Section: Binding Problem
confidence: 99%