2020
DOI: 10.1101/2020.01.08.898528
Preprint

High capacity and dynamic accessibility in associative memory networks with context-dependent neuronal and synaptic gating

Abstract: Context, such as behavioral state, is known to modulate memory formation and retrieval, but is usually ignored in associative memory models. Here, we propose several types of contextual modulation for associative memory networks that greatly increase their performance. In these networks, context inactivates specific neurons and connections, which modulates the effective connectivity of the network. Memories are stored only by the active components, thereby reducing interference from memories acquired in other …
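To make the gating idea concrete, here is a minimal sketch (not the authors' implementation; the network size, gating fraction, and one-pattern-per-context storage are illustrative assumptions) of neuron-specific gating in a Hopfield-style network: each context silences a subset of neurons, storage uses only the active components, and recall runs within the gated effective connectivity.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200              # neurons
n_contexts = 4
f_active = 0.5       # fraction of neurons active per context (assumed value)

# Each context gates a random subset of neurons (1 = active, 0 = silenced).
gates = (rng.random((n_contexts, N)) < f_active).astype(float)

# One +/-1 pattern stored per context; storage uses only the active
# components, so memories from different contexts interfere less.
patterns = rng.choice([-1.0, 1.0], size=(n_contexts, N))
W = np.zeros((N, N))
for c in range(n_contexts):
    p = patterns[c] * gates[c]   # silenced neurons contribute nothing
    W += np.outer(p, p) / N
np.fill_diagonal(W, 0.0)

def recall(x, context, steps=20):
    """Synchronous recall restricted to the context's active subnetwork."""
    g = gates[context]
    x = x * g
    for _ in range(steps):
        x = np.sign((W * np.outer(g, g)) @ x)  # gated effective connectivity
        x[x == 0] = 1.0
        x = x * g
    return x

# Cue with a corrupted version of pattern 0 in context 0.
cue = patterns[0].copy()
cue[rng.random(N) < 0.15] *= -1
out = recall(cue, context=0)
g0 = gates[0]
overlap = (out * patterns[0] * g0).sum() / g0.sum()
print(f"overlap with stored pattern on active units: {overlap:.2f}")
```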

Cited by 8 publications (4 citation statements) | References 129 publications (301 reference statements)
“…The abstractions in these models have enabled extensive theoretical results related to these attractor networks, including the theoretical limit for the number of patterns or concepts stored in a recurrent network (Amit et al., 1985; Gardner, 1988). Various aspects of the models have also been made more biologically realistic, for example sparse coding (Amari, 1989; Gripon et al., 2016), asymmetric connections (Tsodyks, 1989), or context dependency (Podlaski et al., 2020). It is interesting to compare the abstracted learning rules of associative memory models with their more biologically plausible local and dynamic learning counterparts, as they are used in studies on assembly formation.…”
Section: Discussion
confidence: 99%
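The capacity limit referenced here (Amit et al., 1985) is roughly 0.14·N random patterns for a standard Hopfield network with Hebbian storage. A small sketch of that breakdown (network size, cue noise, and the tested loads are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 300  # network size

def hopfield_overlap(P, steps=10):
    """Store P random patterns with the Hebbian rule and test recall of one."""
    patterns = rng.choice([-1.0, 1.0], size=(P, N))
    W = (patterns.T @ patterns) / N   # sum of outer products
    np.fill_diagonal(W, 0.0)
    x = patterns[0].copy()
    x[rng.random(N) < 0.1] *= -1      # 10% corrupted cue
    for _ in range(steps):
        x = np.sign(W @ x)
        x[x == 0] = 1.0
    return float(x @ patterns[0]) / N # overlap with the stored pattern

for load in (0.05, 0.10, 0.14, 0.20): # alpha = P/N; retrieval degrades near ~0.138
    print(f"alpha={load:.2f}  overlap={hopfield_overlap(int(load * N)):.2f}")
```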
“…Even the co-activation of C1 and P0 is not enough to automatically activate P1 on its own, in the context of the static model without adaptation or inhibition. In our framework, the activation of one context favors the recall of concepts associated with it, and it can be qualitatively compared to the neuron-specific gating model proposed in [30], where the activation of one context defines a subset of available neurons.…”
Section: Discussion
confidence: 99%
“…A powerful solution is to create separate networks (Aljundi et al. 2017), or subnetworks (Rusu et al. 2016, Li & Hoiem 2017), for each task (context) and use an input representing the current context (task identity) to index the appropriate (sub)network that needs to be called upon. To avoid having to grow the network with time, which may limit generalization across tasks and require biologically implausible mechanisms, fixed network architectures with context-dependent gating have also been studied (Masse et al. 2018, Podlaski et al. 2020, Flesch et al. 2022). In these networks, a fraction of the units is used for each task (while the rest are gated by setting their activities to 0).…”
Section: Continual Learning
confidence: 99%
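A schematic sketch of the fixed-architecture, context-gated scheme described here (in the spirit of Masse et al. 2018; the layer sizes, ReLU nonlinearity, and 20% keep fraction are assumptions, not the cited papers' exact setups): each task is assigned a fixed random mask, and hidden activity outside the task's subnetwork is set to 0.

```python
import numpy as np

rng = np.random.default_rng(2)
n_in, n_hidden, n_out = 32, 128, 4
n_tasks = 5
keep_frac = 0.2   # fraction of hidden units used per task (assumed value)

# Fixed architecture; each task gets a fixed random binary mask over hidden units.
masks = (rng.random((n_tasks, n_hidden)) < keep_frac).astype(float)

W1 = rng.normal(0, 1 / np.sqrt(n_in), (n_hidden, n_in))
W2 = rng.normal(0, 1 / np.sqrt(n_hidden), (n_out, n_hidden))

def forward(x, task):
    """Hidden activity is gated (set to 0) outside the task's subnetwork."""
    h = np.maximum(0.0, W1 @ x)   # ReLU hidden layer
    h = h * masks[task]           # context-dependent gating
    return W2 @ h

x = rng.normal(size=n_in)
print(forward(x, task=0))         # same input routed through task-0 subnetwork
print(forward(x, task=1))         # ...and through a mostly disjoint subnetwork
```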