2000
DOI: 10.1038/81453

Synaptic plasticity: taming the beast

Abstract: Synaptic plasticity provides the basis for most models of learning, memory and development in neural circuits. To generate realistic results, synapse-specific Hebbian forms of plasticity, such as long-term potentiation and depression, must be augmented by global processes that regulate overall levels of neuronal and network activity. Regulatory processes are often as important as the more intensively studied Hebbian processes in determining the consequences of synaptic plasticity for network function. Recent e…


Cited by 1,937 publications
(1,550 citation statements)
References 46 publications
“…For the last 10 years, solutions were proposed for emulating classic learning rules in SNNs [24,30,4], by means of drastic simplifications that often resulted in losing precious features of firing time-based computing. As an alternative, various researchers have proposed different ways to exploit recent advances in neuroscience about synaptic plasticity [1], especially IP 2 [10,9] or STDP 3 [28,19], that is usually presented as the Hebb rule, revisited in the context of temporal coding. A current trend is to propose computational justifications for plasticity-based learning rules, in terms of entropy minimization [5] as well as log-likelihood [35] or mutual information maximization [8,46,7].…”
Section: Spiking Neuron Network (mentioning)
confidence: 99%
“…linear units, sigmoid neurons, threshold gates or spiking neurons (e.g. LIF 1 ), as far as the internal network behaves like a nonlinear dynamical system. In most models, simple learning rules, such as linear regression or recursive least mean squares, are applied to readout neurons only.…”
Section: Reservoir Computing (mentioning)
confidence: 99%
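The excerpt above describes the reservoir-computing setup in which the recurrent network is left fixed and only the readout neurons are trained, for instance by linear regression. A minimal echo-state sketch of that idea follows; all sizes, the input signal, the delay task, and the ridge parameter are illustrative assumptions, not values from the cited work:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the cited work).
n_res, n_in, T = 100, 1, 500

# Fixed random reservoir: input weights plus a recurrent matrix
# rescaled to spectral radius 0.9 for echo-state stability.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

# Drive the reservoir with a sine wave; the task is to reproduce
# the input delayed by 5 steps (a toy memory task).
u = np.sin(np.linspace(0, 8 * np.pi, T)).reshape(-1, 1)
x = np.zeros(n_res)
states = np.zeros((T, n_res))
for t in range(T):
    x = np.tanh(W @ x + W_in @ u[t])
    states[t] = x

target = np.roll(u, 5, axis=0)

# Only the readout is trained, here by ridge regression
# (regularised least squares); the reservoir stays untouched.
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ target)
pred = states @ W_out
mse = float(np.mean((pred[50:] - target[50:]) ** 2))  # skip washout
print(mse)
```

The key design point matches the quote: learning touches a single linear map (`W_out`), while the nonlinear dynamical system supplying the features is random and fixed.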
“…This type of homeostatic regulation has been demonstrated in various experimental settings (Turrigiano et al 1998;Turrigiano 2007) and its theoretical properties have been studied in computational models (Abbott & Nelson 2000). The precise biological signalling mechanisms underlying synaptic scaling remain poorly understood, however.…”
Section: Introduction (mentioning)
confidence: 99%
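The homeostatic regulation mentioned above — synaptic scaling — multiplies all of a neuron's input weights by a common factor so that its firing rate drifts toward a set point while relative synaptic strengths are preserved. A toy rate-model sketch of that multiplicative rule; the target rate, time constant, and linear firing-rate model are assumptions for illustration only:

```python
import numpy as np

# Illustrative parameters (not from the cited experiments).
target_rate = 5.0   # desired mean firing rate
tau = 100.0         # slow homeostatic time constant
dt = 1.0

rng = np.random.default_rng(1)
w = rng.uniform(0.5, 1.5, 20)        # synaptic weights onto one neuron
inputs = rng.uniform(0.0, 1.0, 20)   # assumed mean presynaptic rates

def firing_rate(w):
    # Toy linear rate model: postsynaptic rate = weighted input sum.
    return float(w @ inputs)

ratios_before = w / w.sum()
for _ in range(5000):
    r = firing_rate(w)
    # Multiplicative scaling: every weight gets the same factor, so
    # relative strengths set by Hebbian plasticity are preserved.
    w *= 1.0 + (dt / tau) * (target_rate - r) / target_rate

ratios_after = w / w.sum()
print(round(firing_rate(w), 3))  # settles at the target rate
```

Because the update is a single shared factor, the ratio between any two weights never changes; only the overall gain is regulated, which is the theoretical property the quoted passage alludes to.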
“…However, it is also clear that the formation of these ensembles is mediated through rapid anatomical/physiological changes. It has been established that the temporally ordered co-activation of neural populations leads to rapid synaptic changes via spike-timing-dependent synaptic modification (spike-timingdependent plasticity, STDP) processes (Bi & Poo 1998;Abbott & Nelson 2000;Song et al 2000;Song & Abbott 2001). Since these synaptic modifications are directional, one would also expect changes in directional relationships between the firing patterns of neurons.…”
Section: Introduction (mentioning)
confidence: 99%
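The spike-timing-dependent plasticity the quote refers to is commonly modelled with a pair-based rule and exponential windows: a presynaptic spike shortly before a postsynaptic spike potentiates the synapse, the reverse order depresses it. A minimal sketch of that standard window function; the amplitudes and time constants are illustrative, not taken from the cited papers:

```python
import numpy as np

# Pair-based STDP with exponential windows. Parameter values are
# assumptions for illustration, not fits to Bi & Poo (1998).
A_plus, A_minus = 0.01, 0.012   # potentiation / depression amplitudes
tau_plus = tau_minus = 20.0     # window time constants (ms)

def stdp(delta_t):
    """Weight change for a spike-time difference
    delta_t = t_post - t_pre (ms)."""
    if delta_t > 0:                          # pre before post: LTP
        return A_plus * np.exp(-delta_t / tau_plus)
    else:                                    # post before pre: LTD
        return -A_minus * np.exp(delta_t / tau_minus)

print(stdp(10.0) > 0)    # pre leads post by 10 ms -> potentiation
print(stdp(-10.0) < 0)   # post leads pre by 10 ms -> depression
```

The sign of the change depends only on spike order, which is why — as the quoted passage notes — STDP induces directional relationships between the firing patterns of connected neurons.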