2021
DOI: 10.48550/arxiv.2104.04132
Preprint
Replay in Deep Learning: Current Approaches and Missing Biological Elements

Abstract: Replay is the reactivation of one or more neural patterns, which are similar to the activation patterns experienced during past waking experiences. Replay was first observed in biological neural networks during sleep, and it is now thought to play a critical role in memory formation, retrieval, and consolidation. Replay-like mechanisms have been incorporated into deep artificial neural networks that learn over time to avoid catastrophic forgetting of previous knowledge. Replay algorithms have been successfully…

Cited by 7 publications (9 citation statements)
References 240 publications (323 reference statements)
“…This process of buffering data, or experiences, which are later replayed to cortical networks is precisely the role ascribed to the hippocampus in the complementary learning systems theory [51]. Our work therefore adds meta-learning to the repertoire of functions for which replay might be critical [52].…”
Section: Discussion
confidence: 84%
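The buffering-then-replay loop this statement ascribes to the hippocampus can be sketched as an interleaved training step. This is a minimal illustration, not the cited work's implementation; the function name `consolidate`, the `replay_ratio` parameter, and the list-based buffer are placeholders for whatever learner and storage a real system would use.

```python
import random

def consolidate(cortical_update, hippocampal_buffer, new_batch, replay_ratio=0.5):
    """Interleave buffered past experiences with new data before updating the
    slow 'cortical' learner, in the spirit of complementary learning systems.
    `cortical_update` is any training step that accepts a list of examples."""
    n_replay = int(len(new_batch) * replay_ratio)
    replayed = random.sample(hippocampal_buffer,
                             min(n_replay, len(hippocampal_buffer)))
    mixed = new_batch + replayed
    random.shuffle(mixed)          # interleave old and new examples
    cortical_update(mixed)         # one update on the mixed batch
    hippocampal_buffer.extend(new_batch)  # buffer new experiences for later replay
    return mixed
```

The key design point is that the fast store (the buffer) and the slow learner are updated on different schedules: the buffer immediately, the learner only on interleaved batches.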
“…In particular, a two-stage consolidation can be envisaged where a quick (real-time) update takes place at the output layer and a slower (but more precise) knowledge organization is performed in the background, affecting deeper layers. This is also in line with biological learning, where the hippocampus can make recent knowledge immediately accessible while consolidation in the cortex is carried out throughout sleep-awake cycles [6]. The application is equipped with a MobileNetV1-1.0 in 32-bit floating point format working with 128x128 input images [7] and leverages a fixed-size replay buffer of just 500 patterns for which only latent representations are kept.…”
Section: Core: An Android Application Based On Caffe
confidence: 75%
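The fixed-size buffer of latent representations described above can be sketched as follows. This is an illustration under assumptions, not the CORe/Caffe application's code: the reservoir-sampling eviction policy is one plausible choice for keeping a fixed 500-pattern buffer over a data stream, and the toy latent vectors stand in for real MobileNet activations.

```python
import random

class LatentReplayBuffer:
    """Fixed-capacity buffer that stores latent representations (not raw
    images), keeping memory use constant as the data stream grows."""

    def __init__(self, capacity=500):
        self.capacity = capacity
        self.buffer = []   # list of (latent_vector, label) pairs
        self.seen = 0      # total patterns observed so far

    def add(self, latent, label):
        # Reservoir sampling: each observed pattern ends up in the buffer
        # with equal probability, using only O(capacity) memory.
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append((latent, label))
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = (latent, label)

    def sample(self, batch_size):
        # Draw a replay mini-batch to interleave with new training data.
        return random.sample(self.buffer, min(batch_size, len(self.buffer)))

buf = LatentReplayBuffer(capacity=500)
for i in range(2000):
    buf.add([float(i)], i % 10)   # toy latent vectors and labels
```

Storing latents rather than raw images is what makes a 500-pattern buffer viable on-device: only the layers above the stored representation need to be re-run during replay.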
“…Several approaches for this kind of "Continual Learning" problem [1] have been developed in the A.I. space, including: dynamic architectures that can grow network capacity, weight regularization-based approaches that mitigate catastrophic forgetting by constraining the update of previous weights [2], [3], and interleaved replay of training examples from previous tasks [4]. In contrast to rote replay (e.g., using a memory buffer to store oneto-one copies of experience), generative replay [4] does not store exact copies of specific examples from previous tasks, but instead trains a network to retain higher-level/compressed representations, from which it can create de novo synthetic training samples.…”
Section: Introduction
confidence: 99%
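The contrast drawn above between rote replay (storing one-to-one copies) and generative replay (sampling de novo synthetic examples from a learned model) can be sketched as follows. A trivial per-class Gaussian stands in for the generator network of [4]; this is a conceptual illustration, not the cited method.

```python
import random
import statistics

# Rote replay: keep exact copies of past examples and resample them.
rote_memory = []

def rote_replay(n):
    return random.sample(rote_memory, min(n, len(rote_memory)))

# Generative replay: fit a compact model of past data, then generate
# synthetic examples instead of storing copies. A per-class 1-D Gaussian
# is a stand-in for a trained generator network.
class GaussianGenerator:
    def __init__(self):
        self.stats = {}  # label -> (mean, stdev)

    def fit(self, data):
        by_label = {}
        for x, y in data:
            by_label.setdefault(y, []).append(x)
        for y, xs in by_label.items():
            mu = statistics.fmean(xs)
            sigma = statistics.pstdev(xs) or 1e-6
            self.stats[y] = (mu, sigma)

    def sample(self, n):
        # De novo synthetic examples: no real datum is retained verbatim.
        labels = list(self.stats)
        out = []
        for _ in range(n):
            y = random.choice(labels)
            mu, sigma = self.stats[y]
            out.append((random.gauss(mu, sigma), y))
        return out

gen = GaussianGenerator()
gen.fit([(random.gauss(0.0, 1.0), 0) for _ in range(100)]
        + [(random.gauss(5.0, 1.0), 1) for _ in range(100)])
synthetic = gen.sample(10)
```

The trade-off the quote highlights is memory versus fidelity: the generator's parameters replace the raw buffer, at the cost of replaying approximations rather than exact past inputs.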