2002
DOI: 10.1016/s0925-2312(01)00658-0
Error-backpropagation in temporally encoded networks of spiking neurons

Abstract: For a network of spiking neurons that encodes information in the timing of individual spike-times, we derive a supervised learning rule, SpikeProp, akin to traditional error-backpropagation and show how to overcome the discontinuities introduced by thresholding. With this algorithm, we demonstrate how networks of spiking neurons with biologically reasonable action potentials can perform complex non-linear classification in fast temporal coding just as well as rate-coded networks. We perform experiments for the…
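The abstract's key idea — overcoming the threshold discontinuity — can be sketched as follows. This is an illustrative reconstruction, not the paper's reference code: a single spike-response-model neuron fires when its summed postsynaptic potential crosses a threshold, and SpikeProp makes the firing time differentiable by linearising the potential around the actual crossing, so that ∂t/∂w = −ε/(∂u/∂t). The function names, the α-kernel time constant, and the learning rate are all assumptions made for the sketch.

```python
import numpy as np

TAU = 7.0  # assumed time constant of the alpha-shaped PSP kernel

def psp_kernel(s, tau=TAU):
    """Alpha kernel eps(s) = (s/tau) * exp(1 - s/tau) for s > 0, else 0."""
    return np.where(s > 0, (s / tau) * np.exp(1.0 - s / tau), 0.0)

def srm_potential(t, weights, input_times):
    """Membrane potential: weighted sum of PSPs from earlier input spikes."""
    return np.dot(weights, psp_kernel(t - input_times))

def first_spike_time(weights, input_times, theta=1.0, dt=0.01, t_max=50.0):
    """Earliest time the potential crosses threshold theta (None if silent)."""
    for t in np.arange(0.0, t_max, dt):
        if srm_potential(t, weights, input_times) >= theta:
            return t
    return None

def spikeprop_step(weights, input_times, t_target, lr=0.01, theta=1.0):
    """One SpikeProp-style update for a single output neuron.

    The threshold discontinuity is handled by linearising the potential
    around the actual firing time t_a, giving dt_a/dw_i = -eps_i / (dU/dt).
    """
    t_a = first_spike_time(weights, input_times, theta)
    if t_a is None:            # silent neuron: no spike, no gradient
        return weights, None
    eps = psp_kernel(t_a - input_times)
    # slope of the potential at the crossing, approximated numerically
    h = 1e-4
    dU = (srm_potential(t_a + h, weights, input_times)
          - srm_potential(t_a - h, weights, input_times)) / (2 * h)
    # gradient of E = 0.5 * (t_a - t_target)^2 via the linearisation
    grad = (t_a - t_target) * (-eps / dU)
    return weights - lr * grad, t_a
```

Note that a neuron that never crosses threshold yields no gradient at all — the same degenerate case the quoted Introduction excerpt below raises about hidden-layer initialisation.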

Cited by 931 publications (804 citation statements)

References 41 publications
“…A margin criterion is applied, via a stochastic iterative learning process, for strengthening the separation between the spike-timing of the readout (output) neurons. This idea fits with the similarity that has recently been proposed [38,17] between RC and SVM, where the reservoir is compared to the high-dimensional feature space resulting from a kernel transformation. In our algorithm, as in the machine learning literature, the application of a margin criterion is justified by the fact that maximizing a margin between the positive and the negative class yields better expected generalization performance [48].…”
Section: Multi-timescale Learning
confidence: 82%
“…However, discovering efficient learning rules adapted to SNNs is still a hot topic. Over the last 10 years, solutions have been proposed for emulating classic learning rules in SNNs [24,30,4], by means of drastic simplifications that often resulted in losing precious features of firing time-based computing. As an alternative, various researchers have proposed different ways to exploit recent advances in neuroscience about synaptic plasticity [1], especially IP [10,9] or STDP [28,19], which is usually presented as the Hebb rule, revisited in the context of temporal coding.…”
Section: Spiking Neuron Network
confidence: 99%
“…Our simulations used networks with two layers (input, output) and three layers (input, hidden, and output). The two-layered networks are similar to the ones used for ReSuMe [21], and the three-layered ones are similar to the ones used with SpikeProp [3].…”
Section: Network
confidence: 99%
“…SpikeProp's application to the non-linearly separable Exclusive-OR problem also follows this pattern [3]. Generally, for SpikeProp-based algorithms it is crucial that hidden-layer neurons are initialised such that they spike at least once for all patterns; otherwise no error signals arise for that neuron and its weights.…”
Section: Introduction
confidence: 99%