2014
DOI: 10.1016/j.neuron.2014.03.026
Learning Precisely Timed Spikes

Abstract: To signal the onset of salient sensory features or execute well-timed motor sequences, neuronal circuits must transform streams of incoming spike trains into precisely timed firing. To address the efficiency and fidelity with which neurons can perform such computations, we developed a theory to characterize the capacity of feedforward networks to generate desired spike sequences. We find the maximum number of desired output spikes a neuron can implement to be 0.1-0.3 per synapse. We further present a biologica…
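
A toy simulation makes the setup concrete. The sketch below is not the paper's model or analysis; it assumes a generic current-based leaky integrate-and-fire neuron with exponentially filtered synapses, and every parameter (synapse count, input rate, time constants, threshold) is an illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the paper).
n_syn = 200                  # input synapses
T, dt = 0.5, 1e-4            # trial duration and time step (s)
tau_m, tau_s = 0.02, 0.005   # membrane and synaptic time constants (s)
v_thresh = 1.0               # firing threshold (arbitrary units)
rate = 20.0                  # input Poisson rate (Hz)

steps = int(T / dt)
w = rng.normal(0.05, 0.1, n_syn)                 # synaptic weights
inputs = rng.random((steps, n_syn)) < rate * dt  # Poisson input spike trains

v = 0.0                  # membrane potential
trace = np.zeros(n_syn)  # exponentially filtered presynaptic activity
out_spikes = []

for t in range(steps):
    trace += -trace * dt / tau_s + inputs[t]  # synaptic filtering
    v += (-v + w @ trace) * dt / tau_m        # leaky integration of input current
    if v >= v_thresh:                         # threshold crossing: emit a spike
        out_spikes.append(t * dt)
        v = 0.0                               # reset

print(f"{len(out_spikes)} output spikes in {T} s trial")
```

Learning, in this picture, means adjusting w so that the threshold crossings land at desired times; the paper's capacity result bounds how many such desired output spikes per synapse can be realized.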

Cited by 133 publications (152 citation statements)
References 62 publications
Citation types: 4 supporting, 148 mentioning, 0 contrasting

Citation statements (ordered by relevance):
“…Similarly, Gardner and Grüning (2016) and Albers, Westkott, and Pawelzik (2016) have studied the convergence properties of rules that reduce the van Rossum distance by gradient descent. Moreover, Memmesheimer, Rubin, Ölveczky, & Sompolinsky (2014) proposed a learning algorithm that achieves high capacity in learning long, precisely timed spike trains in single units and recurrent networks. The problem of sequence learning in recurrent neural networks has also been studied as a variational learning problem (Brea, Senn, & Pfister, 2013; Jimenez Rezende & Gerstner, 2014) and by combining adaptive control theory with heterogeneous neurons (Gilra & Gerstner, 2017).…”
Section: Introduction · Citation type: mentioning
confidence: 99%
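
The van Rossum distance that these gradient-descent rules reduce compares two spike trains after filtering each with a causal exponential kernel and integrating the squared difference. A minimal sketch, with an illustrative time constant and discretization not taken from any of the cited papers:

```python
import numpy as np

def van_rossum_distance(train_a, train_b, tau=0.01, dt=1e-4, T=1.0):
    """Exponential-filter distance between two spike trains (times in s)."""
    steps = int(T / dt)
    bins_a = set(np.round(np.asarray(train_a) / dt).astype(int))
    bins_b = set(np.round(np.asarray(train_b) / dt).astype(int))
    decay = np.exp(-dt / tau)
    fa, fb = 0.0, 0.0
    sq_diff = 0.0
    for t in range(steps):
        fa = fa * decay + (1.0 if t in bins_a else 0.0)  # filter train A
        fb = fb * decay + (1.0 if t in bins_b else 0.0)  # filter train B
        sq_diff += (fa - fb) ** 2
    return np.sqrt(sq_diff * dt / tau)

print(van_rossum_distance([0.1, 0.3], [0.1, 0.3]))    # identical trains -> 0.0
print(van_rossum_distance([0.1, 0.3], [0.1, 0.305]))  # small shift -> small distance
```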
“…The main reason for the interest in this coding scheme is that sparse coding can have beneficial features for both memory capacity [1,2] and speed of learning [3], as illustrated in various simulations of brain circuitry function [4-9]. From a simulation point of view, it provides the additional advantage of reduced computational load, and is therefore popular in technical applications that, for example, involve artificial neural networks [10,11].…”
Citation type: mentioning
confidence: 99%
“…synaptic delays, which are very difficult to obtain. Additionally, Memmesheimer et al. [22] proposed an inference algorithm based on the perceptron learning rule, similar to Baldassi et al. [23], for which they proved that, given accurate spike times, it identifies a simple n-to-1 feedforward network. They also proposed a heuristic extension that works with finite precision in recorded spike times.…”
Section: Related Work · Citation type: mentioning
confidence: 99%
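
The error-correcting character of such perceptron-based rules can be shown in a toy discrete-time form. The sketch below is a schematic, not the algorithm of Memmesheimer et al. [22]: it ignores membrane dynamics and treats each time bin as an independent perceptron pattern, potentiating weights when a target spike is missed and depressing them when a spurious spike is fired; all sizes and rates are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

n_syn, n_bins = 100, 50
x = (rng.random((n_bins, n_syn)) < 0.1).astype(float)  # binned input spikes
desired = np.zeros(n_bins)
desired[[10, 30, 45]] = 1.0    # bins in which the output should spike

w = np.zeros(n_syn)            # weights to be learned
theta, lr = 1.0, 0.1           # firing threshold and learning rate

for epoch in range(500):
    errors = 0
    for t in range(n_bins):
        fired = x[t] @ w >= theta
        if desired[t] and not fired:    # missed target spike: potentiate
            w += lr * x[t]
            errors += 1
        elif fired and not desired[t]:  # spurious spike: depress
            w -= lr * x[t]
            errors += 1
    if errors == 0:
        print(f"desired spike pattern reproduced after {epoch + 1} epochs")
        break
```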