2017
DOI: 10.1101/116400
Preprint
The temporal paradox of Hebbian learning and homeostatic plasticity

Abstract: Hebbian plasticity, a synaptic mechanism which detects and amplifies co-activity between neurons, is considered a key ingredient underlying learning and memory in the brain. However, Hebbian plasticity alone is unstable, leading to runaway neuronal activity, and therefore requires stabilization by additional compensatory processes. Traditionally, a diversity of homeostatic plasticity phenomena found in neural circuits are thought to play this role. However, recent modelling work suggests that the slow evolutio…


Cited by 155 publications
(250 citation statements)
References 116 publications
“…Finally, we acknowledge that our pretraining intervention only mitigated, but did not remove, the effect of catastrophic interference. Indeed, it is very likely that other mechanisms, including weight protection schemes that allow new learning to be allocated to synapses according to their importance for past learning, will also play a key role ( 3 , 48 ). A promising avenue for future research may be to understand how structure learning and resource allocation schemes interact.…”
Section: Discussion
confidence: 99%
“…Unlike biological odor learning, artificial neural networks optimized for a certain task tend to suffer from catastrophic forgetting, and the pursuit of online learning capabilities in deep networks is a subject of active study (McCloskey and Cohen, 1989; Kemker and Kanan, 2017; Kirkpatrick et al, 2017; Velez and Clune, 2017; Zenke et al, 2017; Serrà et al, 2018). In contrast, the EPLff learning network described herein naturally resists catastrophic forgetting, exhibiting powerful online learning using a fast spike timing-based coding metric.…”
Section: Results
confidence: 99%
“…The higher importance value denotes the importance of a particular parameter for the previous tasks. Similarly, in the synaptic intelligence (SI) approach, the importance value for each parameter is assigned according to the change in the parameter values [32]. In both the above approaches (EWC and SI), the importance is given for each synapse, while in the proposed model, the importance is given for the direction at the neuronal level.…”
Section: Comparison To Other Models
confidence: 99%
“…This approach keeps a copy of the diagonals of the Fisher information matrix and the latest learned parameters. Zenke et al proposed the Synaptic Intelligence model, where the importance of each parameter for the previous tasks is estimated based on the change in parameter values [32]. Based on this value, the parameters with high importance refrain from learning, and the rest of the parameters are allowed to update.…”
Section: Introduction
confidence: 99%
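The Synaptic Intelligence (SI) mechanism described in the excerpts above — estimating each parameter's importance from how much it changed while reducing the loss, then penalizing later changes to important parameters — can be sketched as follows. This is a minimal illustrative toy (quadratic losses, plain gradient descent, made-up hyperparameters), not the authors' implementation.

```python
import numpy as np

# Toy sketch of SI-style importance weighting (after Zenke et al., 2017).
# Names, losses, and hyperparameters here are illustrative assumptions.

rng = np.random.default_rng(0)
theta = rng.normal(size=5)             # model parameters
theta_old = theta.copy()               # values at the start of task A
path_integral = np.zeros_like(theta)   # running sum of -grad * delta_theta
lr, xi, c = 0.1, 1e-3, 0.5             # step size, damping, penalty strength

def grad_task_a(p):
    """Gradient of a toy quadratic loss pulling parameters toward +1."""
    return p - 1.0

# Train on task A, accumulating each parameter's contribution to the
# loss decrease along the optimization path.
for _ in range(200):
    g = grad_task_a(theta)
    delta = -lr * g
    path_integral += -g * delta        # per-parameter contribution
    theta += delta

# Normalize by squared total displacement to obtain the importance Omega.
omega = path_integral / ((theta - theta_old) ** 2 + xi)
anchor = theta.copy()                  # consolidated values after task A

def grad_task_b(p):
    """Toy task B pulls toward -1; the SI penalty term
    2*c*Omega*(p - anchor) resists changing important parameters."""
    return (p + 1.0) + 2 * c * omega * (p - anchor)

for _ in range(200):
    theta -= lr * grad_task_b(theta)

# With nonzero importance, parameters settle between the two task optima
# instead of moving all the way to -1, protecting task-A performance.
```

High-importance parameters "refrain from learning," as the quoted passage puts it, because the quadratic penalty `c * omega * (theta - anchor)**2` dominates the new task's gradient for those parameters; low-importance parameters remain free to adapt.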