The 2011 International Joint Conference on Neural Networks (IJCNN 2011)
DOI: 10.1109/ijcnn.2011.6033476
Perturbation theory for stochastic learning dynamics

Abstract: On-line machine learning and biological spike-timing-dependent plasticity (STDP) rules both generate Markov chains for the synaptic weights. We give a perturbation expansion (in powers of the learning rate) for the dynamics that, unlike the usual approximation by a Fokker-Planck equation (FPE), is rigorous. Our approach extends the related system size expansion by giving an expansion for the probability density as well as its moments. Applied to two observed STDP learning rules, our approach provides better agr…
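To make the setting concrete, the following Python sketch (not the paper's expansion, only an assumed toy model) simulates an online learning rule w_{t+1} = w_t + eta * f(w_t, x_t) with i.i.d. inputs, i.e. a Markov chain on a single synaptic weight, and checks numerically that the stationary variance scales with the learning rate eta at leading order, which is the kind of moment behavior a small-eta perturbation expansion describes. The drift function and input statistics are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def drift(w, x):
    # Hypothetical toy learning rule (relax toward the current input);
    # chosen only so the stationary moments are known in closed form.
    return x - w

def stationary_moments(eta, n_steps=200_000, burn_in=50_000):
    """Simulate the weight Markov chain and estimate its stationary mean and variance."""
    w = 0.0
    samples = []
    for t in range(n_steps):
        x = rng.normal(loc=1.0, scale=1.0)  # i.i.d. presynaptic activity
        w = w + eta * drift(w, x)           # one online learning step
        if t >= burn_in:
            samples.append(w)
    samples = np.asarray(samples)
    return samples.mean(), samples.var()

for eta in (0.2, 0.1, 0.05, 0.025):
    mean, var = stationary_moments(eta)
    # For this linear toy rule the exact stationary variance is eta/(2 - eta),
    # so its leading-order (first power of eta) term is eta/2.
    print(f"eta={eta:5.3f}  mean~{mean:6.3f}  var~{var:6.4f}  leading order eta/2={eta/2:6.4f}")
```

For this linear rule the expansion can be verified by hand: the chain is w_{t+1} = (1 - eta) w_t + eta x_t, so the stationary variance satisfies V = (1 - eta)^2 V + eta^2, giving V = eta / (2 - eta) = eta/2 + O(eta^2). The simulated variances should approach eta/2 as eta shrinks.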

Cited by: 1 publication (1 citation statement)
References: 24 publications
“…However, for systems with only one species a general solution to all orders has recently been derived [108]. A similar approach has been developed in [120] for discrete-time models in neuroscience. Note that the variables described by the system size expansion are typically assumed to be continuous.…”
Section: Properties and Recent Developments
Confidence: 99%