Proceedings of the Twenty-Fourth Annual Conference of the Cognitive Science Society 2019
DOI: 10.4324/9781315782379-50

Preventing Catastrophic Interference in Multiple-Sequence Learning Using Coupled Reverberating Elman Networks

Abstract: Everyone agrees that real cognition requires much more than static pattern recognition. In particular, it requires the ability to learn sequences of patterns (or actions). But learning sequences really means being able to learn multiple sequences, one after the other, without the most recently learned ones erasing the previously learned ones. But if catastrophic interference is a problem for the sequential learning of individual patterns, the problem is amplified many times over when multiple sequences of pat…
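The interference problem the abstract describes can be seen in a minimal sketch (a hypothetical toy setup, not the paper's model): a single logistic unit is trained on one set of patterns, then on a conflicting set, and the first task is erased. All names and the task construction here are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(W, X, y, epochs=2000, lr=0.5):
    # Plain logistic-regression gradient descent.
    for _ in range(epochs):
        W = W - lr * X.T @ (sigmoid(X @ W) - y) / len(X)
    return W

def err(W, X, y):
    # Mean squared error of the unit's output against the targets.
    return float(np.mean((sigmoid(X @ W) - y) ** 2))

# Task A: four one-hot patterns with fixed targets.
X_a = np.eye(4)
y_a = np.array([[1.0], [1.0], [0.0], [0.0]])
# Task B: the same inputs with flipped targets -- maximal conflict.
X_b, y_b = X_a, 1.0 - y_a

W = train(np.zeros((4, 1)), X_a, y_a)
before = err(W, X_a, y_a)   # task A learned well
W = train(W, X_b, y_b)      # sequential training on B, no rehearsal
after = err(W, X_a, y_a)    # task A is now erased
print(f"error on A before B: {before:.4f}, after B: {after:.4f}")
```

Training on B alone drives the weights to the flipped solution, so the error on A jumps from near zero to near its maximum: the signature of catastrophic interference.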

Cited by 9 publications (6 citation statements) · References 16 publications
“…In the same period, French introduced the pseudo-recurrent connectionist network [41,40], a model which makes use of pseudo-replay [92] (replay based on random, but fixed, input patterns) but did not address sequential data processing tasks. Later on, the pseudo-recurrent network together with pseudo-replay inspired the Reverberating Simple Recurrent Network (RSRN) [9,8]. This is a dual model composed of two auto-associative recurrent networks which exchange information by means of pseudo-patterns.…”
Section: Survey Of Continual Learning In Recurrent Modelsmentioning
confidence: 99%
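The pseudo-pattern exchange described above can be illustrated with a minimal single-network sketch (an illustrative toy under assumed settings, not the RSRN itself): random inputs are labeled by the network trained on the old task, and these pseudo-patterns are interleaved with the new task's items, anchoring the weights to the old function.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(W, X, t, epochs=2000, lr=0.5):
    # Logistic gradient steps; t may contain soft (pseudo) targets.
    for _ in range(epochs):
        W = W - lr * X.T @ (sigmoid(X @ W) - t) / len(X)
    return W

def err(W, X, t):
    return float(np.mean((sigmoid(X @ W) - t) ** 2))

# Old task A: four one-hot patterns; new task B: same inputs, flipped
# targets, so naive sequential training overwrites A completely.
X_a = np.eye(4)
y_a = np.array([[1.0], [1.0], [0.0], [0.0]])
X_b, y_b = X_a, 1.0 - y_a

W_old = train(np.zeros((4, 1)), X_a, y_a)

# Naive: continue training the same weights on B alone.
err_naive = err(train(W_old.copy(), X_b, y_b), X_a, y_a)

# Pseudo-rehearsal: label random binary inputs with the *old* network's
# outputs and interleave them with B's items during training.
X_pseudo = rng.integers(0, 2, (40, 4)).astype(float)
t_pseudo = sigmoid(X_pseudo @ W_old)
X_mix = np.vstack([X_b, X_pseudo])
t_mix = np.vstack([y_b, t_pseudo])
err_rehearsal = err(train(W_old.copy(), X_mix, t_mix), X_a, y_a)

print(f"error on A -- naive: {err_naive:.4f}, "
      f"pseudo-rehearsal: {err_rehearsal:.4f}")
```

Because the pseudo-patterns outnumber the new items, their gradients pull the weights back toward the old input-output function, so performance on the old task degrades far less than under naive sequential training.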
“…It is defined as the part of the prefrontal cortex that receives projections from the medial dorsal nucleus of the thalamus, and is thought to represent emotion and reward in decision making [28]. The prefrontal cortex, consisting of Brodmann areas 8, 9, 10, 11, 12, 13, 44, 45, 46 and 47, includes the OFC but covers a wider range of functionality.…”
Section: Reciprocitymentioning
confidence: 99%
“…The right side of Figure 4 (B) highlights reciprocal connections between cortical areas (Brodmann areas 7, 8, 9, 11, 12, 13, 19, 20, 21, 22, 23 and 46) and the hippocampal complex via the adjacent perirhinal cortex (shown as blue connections) and the parahippocampal cortex (shown as red connections) that are involved in representing and recognizing objects and environmental scenes.…”
Section: Reciprocitymentioning
confidence: 99%
“…The main problem with Creative is the rather large number of robots in L < 0. This might be attributed to Catastrophic Interference [10] in the memory, as many of these robots did not have any wall contact during pretraining. Forgetting of previous knowledge may also occur in the evaluation network, but with a second ANN (memory) in the neurocontroller this effect may be more pronounced.…”
Section: Resultsmentioning
confidence: 99%