2020
DOI: 10.3389/fncir.2020.541728

The Interplay of Synaptic Plasticity and Scaling Enables Self-Organized Formation and Allocation of Multiple Memory Representations

Abstract: It is commonly assumed that memories about experienced stimuli are represented by groups of highly interconnected neurons called cell assemblies. This requires allocating and storing information in the neural circuitry, which happens through synaptic weight adaptations at different types of synapses. In general, memory allocation is associated with synaptic changes at feed-forward synapses while memory storage is linked with adaptation of recurrent connections. It remains, however, largely unknown how memory a…
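The interplay named in the title, Hebbian plasticity counterbalanced by homeostatic synaptic scaling at both feed-forward ("allocation") and recurrent ("storage") synapses, can be illustrated with a minimal rate-based sketch. This is not the published model: the network sizes, the linear activation with clipping, and the parameters mu, kappa, and x_target below are all illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the published model): Hebbian plasticity plus
# homeostatic synaptic scaling on both feed-forward weights
# (stimulus -> memory, "allocation") and recurrent weights
# (memory -> memory, "storage"). All parameters are assumptions.
rng = np.random.default_rng(1)

n_in, n_mem = 6, 20
W_ff = rng.uniform(0.0, 0.1, (n_mem, n_in))    # feed-forward weights
W_rec = rng.uniform(0.0, 0.1, (n_mem, n_mem))  # recurrent weights
np.fill_diagonal(W_rec, 0.0)                   # no self-connections
mu, kappa, x_target = 0.01, 0.02, 1.0          # assumed rates and target activity

stimulus = (rng.uniform(size=n_in) > 0.5).astype(float)  # one binary input pattern
x = np.zeros(n_mem)
for _ in range(500):
    x = np.clip(W_ff @ stimulus + W_rec @ x, 0.0, 2.0)   # bounded linear rates
    err = x_target - x                                    # homeostatic error per neuron
    # Hebbian growth (outer product of post- and presynaptic rates)
    # plus multiplicative scaling toward the target rate:
    W_ff += mu * np.outer(x, stimulus) + kappa * err[:, None] * W_ff
    W_rec += mu * np.outer(x, x) + kappa * err[:, None] * W_rec
    np.fill_diagonal(W_rec, 0.0)

print("mean firing rate after learning:", round(float(x.mean()), 3))
```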

Cited by 11 publications (20 citation statements)
References 81 publications
“…The first (Osan et al., 2011) studied the transition between reconsolidation and extinction in an attractor network, with labilization driven by a mismatch between the training and reactivation patterns. The second (Auth et al., 2020) used a combination of Hebbian plasticity and homeostatic synaptic scaling to allocate stimuli as internal representations in a memory network.…”
Section: Results
confidence: 99%
“…Code for the simulations is available at https://github.com/Felippe-espinelli/scaling_destabilization_models. Model 2 (adaptation of Auth et al., 2020): the model is composed of an input network and a memory network with 36 and 900 neurons, respectively. The memory network is a grid of 30 × 30 neurons organized in a toroidal topology; each of them connects to 4 random neurons in the input area and to its nearest neighbors in the memory area within a radius of 4 neurons.…”
Section: Computational Models
confidence: 99%
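The connectivity described in the quoted passage can be reconstructed as a short sketch. Only the sizes (36 inputs, a 30 × 30 toroidal memory grid, 4 random feed-forward inputs per neuron, recurrent radius 4) come from the quote; the distance metric on the torus, the exclusion of self-connections, and all names are assumptions.

```python
import numpy as np

# Reconstruction of the quoted layout (not the authors' code): 36 input
# neurons, a 30 x 30 memory grid with toroidal (wrap-around) topology.
rng = np.random.default_rng(0)
N_IN, SIDE, RADIUS = 36, 30, 4
N_MEM = SIDE * SIDE  # 900 memory neurons

# Feed-forward: each memory neuron samples 4 distinct random input neurons.
ff_inputs = np.array([rng.choice(N_IN, size=4, replace=False) for _ in range(N_MEM)])

def torus_distance(i, j, side=SIDE):
    """Chebyshev distance between grid cells with wrap-around.
    The quoted text does not name the metric; Chebyshev is an assumption."""
    yi, xi = divmod(i, side)
    yj, xj = divmod(j, side)
    dy = min(abs(yi - yj), side - abs(yi - yj))
    dx = min(abs(xi - xj), side - abs(xi - xj))
    return max(dy, dx)

# Recurrent: each memory neuron connects to all others within the radius.
rec_adj = [
    [j for j in range(N_MEM) if j != i and torus_distance(i, j) <= RADIUS]
    for i in range(N_MEM)
]
print(len(rec_adj[0]))  # (2*4 + 1)**2 - 1 = 80 neighbors under this metric
```

Under a Euclidean or Manhattan metric the neighbor count would differ; the quoted description does not say which the authors used.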
“…Extensive theoretical work has explored the idea that Hebbian forms of plasticity (such as LTP) can destabilize neural networks by saturating synaptic weights, and that the addition of synaptic scaling can counteract this instability to help establish or maintain memory specificity 8,9,10,17. There is some experimental evidence that blocking pathways important for synaptic scaling can influence memory 45,46, but it was not clear from these earlier studies whether these manipulations impacted the transition from generalized to specific memories following the induction of learning.…”
Section: Discussion
confidence: 99%
“…While LTP is generally considered critical for learning and memory 5,6,7, it may not be sufficient to faithfully encode memories due to the positive-feedback nature of Hebbian learning rules 8,9. Left unchecked, this positive feedback could give rise to the unconstrained enhancement of synaptic strengths, which in turn might result in the degradation of memory specificity 10. These theoretical considerations suggest that for a memory to be properly encoded, additional homeostatic plasticity mechanisms are required to counterbalance this “runaway” plasticity.…”
Section: Introduction
confidence: 99%
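The runaway argument quoted above can be checked with a one-synapse toy example: with a linear unit x = w·u, the pure Hebbian update Δw = μ·x·u makes the weight grow geometrically, while adding a multiplicative scaling term κ·(x_target − x)·w creates a stable fixed point. This is a generic illustration of the cited reasoning, not a model from any of the referenced studies; all numbers are assumed.

```python
# Toy illustration of "runaway" Hebbian growth and its stabilization by
# synaptic scaling. One synapse, linear unit x = w * u; all numbers assumed.
mu, kappa, u, x_target = 0.1, 0.5, 1.0, 1.0

w_hebb = w_scaled = 0.1
for _ in range(100):
    # Pure Hebbian: dw = mu * x * u = mu * u**2 * w, i.e. geometric growth.
    w_hebb += mu * (w_hebb * u) * u
    # Hebbian term plus multiplicative scaling toward the target rate.
    x = w_scaled * u
    w_scaled += mu * x * u + kappa * (x_target - x) * w_scaled

print(f"pure Hebbian after 100 steps: {w_hebb:.2e}")  # keeps growing without bound
print(f"with scaling: {w_scaled:.3f}")  # settles at (mu*u**2 + kappa*x_target)/(kappa*u) = 1.2
```

The scaled weight converges to the same fixed point regardless of its initial value, which is the stabilizing role the quoted passage attributes to homeostatic mechanisms.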