2019
DOI: 10.1016/j.neunet.2019.01.012

Continual lifelong learning with neural networks: A review

Abstract: Humans and animals have the ability to continually acquire, fine-tune, and transfer knowledge and skills throughout their lifespan. This ability, referred to as lifelong learning, is mediated by a rich set of neurocognitive mechanisms that together contribute to the development and specialization of our sensorimotor skills as well as to long-term memory consolidation and retrieval. Consequently, lifelong learning capabilities are crucial for computational systems and autonomous agents interacting in the real world…

Cited by 2,156 publications (1,327 citation statements)
References 165 publications
“…As a result, the formation of cross-modal memories becomes a long-term dynamic process. Understanding the long-term dynamics of cortical memory representation in a multimodal environment is not only a worthwhile topic in its own right for brain research, but also significant for inspiring the enhancement of cross-modal learning abilities in artificial brains [6].…”
Section: Introduction (mentioning)
confidence: 99%
“…However, multitask models could help improve generalizability when introduced as an intermediary. Lifelong learning has been used to help models learn from separate but related tasks in stages or continuously and could be used to facilitate multitask training. Lifelong learning protocols have been used to transfer knowledge acquired on old tasks to new ones to improve generalization and facilitate model convergence.…”
Section: Introduction (mentioning)
confidence: 99%
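The statement above describes staged training in which knowledge acquired on earlier tasks is transferred to later ones through shared parameters. The sketch below illustrates that idea in PyTorch; the backbone, synthetic data, and hyperparameters are illustrative assumptions, not the protocol used by the citing work.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

class SharedBackbone(nn.Module):
    """Feature extractor whose weights are reused (transferred) across tasks."""
    def __init__(self, in_dim=32, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())

    def forward(self, x):
        return self.net(x)

def train_task(backbone, head, loader, epochs=1, lr=1e-3):
    """Fine-tune the shared backbone together with a task-specific head."""
    opt = torch.optim.Adam(list(backbone.parameters()) + list(head.parameters()), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(head(backbone(x)), y).backward()
            opt.step()

# Two synthetic "related" tasks trained in stages: features learned on the first
# task are carried into the second through the shared backbone weights.
backbone = SharedBackbone()
heads = [nn.Linear(64, 5), nn.Linear(64, 5)]
for head in heads:
    x, y = torch.randn(128, 32), torch.randint(0, 5, (128,))
    train_task(backbone, head, DataLoader(TensorDataset(x, y), batch_size=32))
```

Note that plain sequential fine-tuning like this has no forgetting-mitigation mechanism: later tasks can overwrite earlier ones, which is exactly the failure mode the review surveys.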
“…As for neural network systems, the most straightforward and pragmatic method to avoid catastrophic forgetting is to retrain a deep learning model completely from scratch with all the old data and new data (Parisi et al., 2018). However, this method has proved to be very inefficient (Parisi et al., 2018). Moreover, the new model learned from scratch may bear little similarity to the old one, which results in poor learning robustness.…”
Section: Introduction (mentioning)
confidence: 99%
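The retrain-from-scratch baseline mentioned above can be made concrete with a short sketch. This is an illustrative Python/PyTorch example rather than the implementation evaluated in the citing paper; the dataset sizes, architecture, and training settings are assumed.

```python
import torch
import torch.nn as nn
from torch.utils.data import ConcatDataset, DataLoader, TensorDataset

def make_task(n=256, in_dim=16, n_classes=4):
    """Synthetic stand-in for a newly arriving batch of task data."""
    return TensorDataset(torch.randn(n, in_dim), torch.randint(0, n_classes, (n,)))

def retrain_from_scratch(datasets, epochs=2, lr=1e-3):
    """Re-initialise the model and train on the union of all data seen so far."""
    model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    loader = DataLoader(ConcatDataset(datasets), batch_size=32, shuffle=True)
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model

# Every arrival of new data triggers a full retraining pass over the whole
# history: forgetting is avoided, but compute and storage grow with each task.
history = []
for _ in range(3):
    history.append(make_task())
    model = retrain_from_scratch(history)
```

The inefficiency noted in the quote is visible in the loop: all previously seen data must be stored and replayed, and training cost increases with every new task.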
“…To accommodate the new knowledge, these methods dynamically allocate neural resources or retrain the model with an increasing number of neurons or layers. Intuitively, these approaches can prevent catastrophic forgetting but may also lead to scalability and generalization issues due to the increasing complexity of the network (Parisi et al., 2018). The last category utilizes the dual-memory learning system, which is inspired by the CLS theory (Hinton and Plaut, 1987; Lopez-Paz and Ranzato, 2017; Gepperth and Karaoguz, 2016).…”
Section: Introduction (mentioning)
confidence: 99%
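The dynamic-allocation approaches described above grow the network as new tasks arrive. Below is a rough progressive-networks-style sketch, assuming PyTorch; the class names, dimensions, and freezing policy are illustrative assumptions and do not reproduce any specific published architecture.

```python
import torch
import torch.nn as nn

class ExpandableNet(nn.Module):
    """Grows by one frozen feature "column" per task, plus a task-specific head."""
    def __init__(self, in_dim=16, hidden=32):
        super().__init__()
        self.in_dim, self.hidden = in_dim, hidden
        self.columns = nn.ModuleList()  # one column per task, frozen after training
        self.heads = nn.ModuleList()    # one classification head per task

    def add_task(self, n_classes):
        """Freeze everything learned so far, then allocate fresh neurons for the new task."""
        for p in self.parameters():
            p.requires_grad_(False)
        self.columns.append(nn.Sequential(nn.Linear(self.in_dim, self.hidden), nn.ReLU()))
        self.heads.append(nn.Linear(self.hidden * len(self.columns), n_classes))

    def forward(self, x, task_id):
        # The head for a task reads features from its own column and all earlier ones.
        feats = torch.cat([col(x) for col in self.columns[: task_id + 1]], dim=1)
        return self.heads[task_id](feats)

net = ExpandableNet()
for task_id, n_classes in enumerate([4, 4, 6]):
    net.add_task(n_classes)
    opt = torch.optim.Adam([p for p in net.parameters() if p.requires_grad], lr=1e-3)
    x, y = torch.randn(128, 16), torch.randint(0, n_classes, (128,))
    for _ in range(5):
        opt.zero_grad()
        nn.functional.cross_entropy(net(x, task_id), y).backward()
        opt.step()
```

Because parameters trained on earlier tasks are never updated, old-task performance is preserved, but parameter count and per-forward cost grow with every task, which is the scalability concern the quoted passage raises.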