2023
DOI: 10.1103/physrevx.13.011009

Forgetting Leads to Chaos in Attractor Networks

Cited by 14 publications (13 citation statements)
References 74 publications
“…p is defined as the maximal number of patterns that are still retrievable or, equivalently, the maximal age at which patterns can still be retrieved. This scenario is similar to previously studied attractor neural network models with online learning and decay term in the synaptic updates [27, 31], or networks with binary synapses [18, 32, 33] in which memories are forgotten exponentially.…”
Section: Model (supporting)
confidence: 62%
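The "palimpsest" scenario quoted above can be sketched numerically: each new pattern is imprinted with a Hebbian update while older synaptic traces decay by a constant factor, so a memory of age a has strength proportional to lam**a and is forgotten exponentially. This is a minimal illustration, not the cited papers' exact models; the network size, decay factor, and pattern count below are assumed values chosen for demonstration.

```python
import numpy as np

# Minimal sketch of a Hopfield-style attractor network with online
# palimpsest learning: old weights decay by lam per stored pattern,
# so memory traces fade exponentially with age (assumed parameters).
rng = np.random.default_rng(0)
N = 200            # number of neurons (illustrative size)
lam = 0.95         # synaptic decay factor per pattern (assumed value)
num_patterns = 60

patterns = rng.choice([-1, 1], size=(num_patterns, N))
W = np.zeros((N, N))
for xi in patterns:
    # Decay existing traces, then imprint the new pattern (Hebbian outer product).
    W = lam * W + (1.0 / N) * np.outer(xi, xi)
np.fill_diagonal(W, 0.0)

def overlap_after_recall(xi, steps=20):
    """Run synchronous sign dynamics from the pattern; return final overlap with it."""
    s = xi.astype(float).copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1.0
    return float(s @ xi) / N

# Recently stored patterns are retrieved with high overlap;
# the oldest traces have decayed below the retrieval threshold.
recent = overlap_after_recall(patterns[-1])
old = overlap_after_recall(patterns[0])
```

Because the decay caps the effective number of stored traces at roughly 1/(1 - lam**2), the network never overloads catastrophically; instead, the oldest memories are continuously overwritten, which is the forgetting behavior the quoted statement refers to.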
“…As demonstrated in Fig 7, the storage capacity of the network increases logarithmically with the network size N, with a prefactor that is proportional to τ = 1/(2r1) as indicated by Eq (10). This case is similar to the model with continuous synapses and exponentially decaying memories, as discussed in [27, 28, 31].…”
Section: Results (mentioning)
confidence: 82%
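The scaling quoted above, a capacity growing like τ ln N with τ = 1/(2r1), implies that doubling the network size adds only a constant number of retrievable patterns. A quick numeric illustration, where the proportionality constant c is a placeholder and not a value from the cited work:

```python
import math

def capacity_estimate(N, r1, c=1.0):
    """Illustrative capacity p ~ c * tau * ln(N), with tau = 1/(2*r1).
    c is a hypothetical constant, not taken from the cited paper."""
    tau = 1.0 / (2.0 * r1)
    return c * tau * math.log(N)

# Doubling N always adds the same increment, c * tau * ln(2):
inc1 = capacity_estimate(2000, r1=0.1) - capacity_estimate(1000, r1=0.1)
inc2 = capacity_estimate(4000, r1=0.1) - capacity_estimate(2000, r1=0.1)
```

The equal increments make the logarithmic (rather than linear) growth of capacity with network size concrete.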
“…Note that this plasticity rule is similar to previously studied ‘palimpsest’ schemes, see e.g. (Mézard et al., 1986; Pereira-Obilinovic et al., 2023). For simplicity, we assume that the rates r_i^μ used to compute synaptic changes during presentation of an image μ are entirely determined by external inputs, r_i^μ = ϕ(ξ_i^μ) for all i when image μ is presented.…”
Section: Assumptions (mentioning)
confidence: 91%
“…Theoretical constructs called attractor networks provide a model for memory in the brain. A new study of such networks traces the route by which memories are stored and ultimately forgotten [1]. The mathematical model and simulations show that, as they age, memories recorded in patterns of neural activity become chaotic, impossible to predict, before disintegrating into random noise.…”
(mentioning)
confidence: 99%