Free recall scaling laws and short-term memory effects in a latching attractor network
Preprint, 2020
DOI: 10.1101/2020.12.19.423464

Abstract: Despite the complexity of human memory, paradigms like free recall have revealed robust qualitative and quantitative characteristics, such as power laws governing recall capacity. Although abstract random matrix models could explain such laws, the possibility of their implementation in large networks of interacting neurons has so far remained unexplored. We study an attractor network model of long-term memory endowed with firing rate adaptation and global inhibition. Under appropriate conditions, the transitio…
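The abstract describes an attractor network in which firing-rate adaptation and global inhibition drive "latching" transitions between stored memories. A minimal sketch of that mechanism (not the paper's actual model — the coupling rule, transfer function, and all parameter values here are illustrative assumptions) can be written with a Hebbian network plus a slow adaptation variable:

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 200, 8                                # neurons, stored patterns
xi = rng.choice([-1.0, 1.0], size=(P, N))    # random binary memories
J = (xi.T @ xi) / N                          # Hebbian couplings
np.fill_diagonal(J, 0.0)

beta, g_adapt, g_inh, tau_a = 5.0, 0.6, 0.3, 20.0   # illustrative parameters
r = np.tanh(beta * xi[0])                    # start near memory 0
a = np.zeros(N)                              # per-neuron adaptation variable

overlaps = []
for t in range(600):
    # recurrent input, minus adaptation, minus global inhibition
    h = J @ r - g_adapt * a - g_inh * r.mean()
    r = np.tanh(beta * h)
    a += (r - a) / tau_a                     # adaptation slowly tracks activity
    overlaps.append(xi @ r / N)              # overlap with each stored memory

m = np.array(overlaps)
# Adaptation erodes the currently active attractor; for suitable parameter
# choices the network "latches" from one memory to another, and the sequence
# of visited memories plays the role of a recall sequence.
visited = [int(np.argmax(np.abs(row))) for row in m[::50]]
print(visited)
```

Whether and when latching actually occurs depends sensitively on the adaptation strength, inhibition, and memory load; the sketch only shows the ingredients the abstract names, not the scaling-law analysis.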


Cited by 1 publication (2 citation statements)
References 76 publications
“…An event or item sequence is mapped to a path through a network of cognitive states, whereupon the states’ neural representations are summed into a path vector from which the sequence can later be retrieved. Networks of hidden states have found success in accounting for animal behavior and neural activity (36, 37) and human free recall (6, 38, 39); however, it is unknown how sequences of states are stored (e.g. to repeat back a series of stimuli or actions), possibly retained in a unique compact representation, and retrieved.…”
Section: Discussion
confidence: 99%
“…Although a comprehensive picture has not emerged, models for sequence memory abound at many levels. Biophysically, sequences could be stored in synaptic changes (46) or graded intracellular states reactivated by oscillations (1, 7). From the viewpoint of distributed coding, models such as the temporal context model or vector symbolic architectures (8, 9) encode sequences by binding high-dimensional “item vectors” to positional or context vectors via e.g.…”
confidence: 99%
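The second citation statement mentions vector symbolic architectures that encode sequences by binding high-dimensional item vectors to positional or context vectors. A minimal sketch of that idea, using elementwise (Hadamard) binding of random bipolar vectors — one common VSA choice, picked here only for illustration since the quoted text is truncated — looks like this:

```python
import numpy as np

rng = np.random.default_rng(1)
D = 4096                                   # dimensionality of the hypervectors

def rand_hv():
    """Random bipolar hypervector; distinct draws are nearly orthogonal."""
    return rng.choice([-1.0, 1.0], size=D)

items = {name: rand_hv() for name in "ABC"}
pos = [rand_hv() for _ in range(3)]        # positional (context) vectors

# Encode the sequence A, B, C: bind each item to its position, then superpose
trace = sum(items[s] * pos[i] for i, s in enumerate("ABC"))

# Retrieve the item at position 1: unbind (Hadamard binding is self-inverse)
# and compare the noisy result against the item codebook by scaled dot product
probe = trace * pos[1]
sims = {name: float(probe @ v) / D for name, v in items.items()}
best = max(sims, key=sims.get)
print(best)
```

For large D the cross terms behave like small random noise, so unbinding recovers the correct item with high probability; this is the sense in which a whole sequence can be retained in a single compact trace, as the quoted passage describes.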