2016 IEEE First International Conference on Data Stream Mining & Processing (DSMP)
DOI: 10.1109/dsmp.2016.7583517

A fast learning algorithm of self-learning spiking neural network

Cited by 4 publications (2 citation statements); references 5 publications.
“…However, in SNNs, signals are represented by streams of spike events and flow layer by layer via spikes created by neurons, ultimately driving the firing of output neurons, which collect evidence over time. This mechanism gives SNNs advantages such as efficient processing of time-varying inputs [22] and high computational performance on specialized hardware [23]. However, it also implies that even for a time-invariant input the network output may vary over time, especially at the beginning of the spike signal's presentation to the network, because the output neurons have not yet collected sufficient spike evidence.…”
Section: Inference Latency
Confidence: 99%
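The quoted passage describes how SNN outputs stabilize only after output neurons have accumulated enough spike evidence. A minimal sketch of this effect, using a generic leaky integrate-and-fire (LIF) layer with illustrative parameters (none of which are taken from the cited paper):

```python
import numpy as np

# Illustrative sketch only: a minimal LIF output layer driven by a
# constant (time-invariant) input. All names, thresholds, and time
# constants here are assumptions for illustration.

def lif_step(v, i_in, tau=10.0, v_th=1.0):
    """One Euler step of a LIF membrane: leak toward the input current,
    then spike and reset any neuron crossing the threshold."""
    v = v + (-v + i_in) / tau
    spikes = (v >= v_th).astype(float)
    v = np.where(spikes > 0, 0.0, v)  # reset spiking neurons
    return v, spikes

# A fixed input current for three output neurons.
i_in = np.array([0.8, 1.2, 1.6])

v = np.zeros(3)
counts = np.zeros(3)          # spike evidence collected per neuron
counts_early = None
for t in range(1, 101):
    v, s = lif_step(v, i_in)
    counts += s
    if t == 10:
        counts_early = counts.copy()   # early readout
counts_late = counts.copy()            # late readout

# Early in the run only the strongest neuron has spiked at all; the
# relative spike counts (and hence a rate-based decision) settle only
# after enough time steps have elapsed.
print(counts_early, counts_late)
```

Here the input never changes, yet a readout taken after 10 steps sees evidence from only one neuron, while a readout after 100 steps sees the full ordering of all three. This is the inference-latency trade-off the citing work refers to.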