2016
DOI: 10.48550/arxiv.1611.07725
Preprint

iCaRL: Incremental Classifier and Representation Learning

Cited by 20 publications (34 citation statements)
References 0 publications

“…Early attempts to mitigate catastrophic forgetting typically consisted of memory systems that store previous data and that regularly replay old samples interleaved with samples drawn from the new data (Robins 1993, 1995), and these methods are still used today (Gepperth & Karaoguz 2015, Rebuffi et al. 2016). However, a general drawback of memory-based systems is that they require explicit storage of old information, leading to large working memory requirements.…”
Section: Lifelong Machine Learning (mentioning)
confidence: 99%
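The rehearsal mechanism this statement describes is easy to sketch. The following Python snippet is a minimal illustration, not code from any cited paper; ReplayBuffer, add, and mixed_batch are hypothetical names, and reservoir sampling is one common (assumed) policy for keeping the memory bounded.

```python
import random

class ReplayBuffer:
    """Fixed-size rehearsal memory: stores past (x, y) samples and
    replays them interleaved with new data (hypothetical sketch)."""

    def __init__(self, capacity=1000, seed=0):
        self.capacity = capacity
        self.memory = []      # explicitly stored old samples
        self.n_seen = 0       # total stream samples observed
        self.rng = random.Random(seed)

    def add(self, x, y):
        # Reservoir sampling keeps a uniform random subset of the
        # stream within the fixed capacity (an assumed choice).
        self.n_seen += 1
        if len(self.memory) < self.capacity:
            self.memory.append((x, y))
        else:
            j = self.rng.randrange(self.n_seen)
            if j < self.capacity:
                self.memory[j] = (x, y)

    def mixed_batch(self, new_batch, n_replay=16):
        # Interleave stored old samples with the incoming batch, so
        # the learner keeps rehearsing earlier data.
        n = min(n_replay, len(self.memory))
        return list(new_batch) + self.rng.sample(self.memory, n)
```

Training on the output of mixed_batch instead of the new batch alone is the interleaving the quote refers to; the list held in self.memory is precisely the explicit storage cost the authors flag as a drawback.
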
“…Other methods use an episodic memory, such as iCaRL (Incremental Classifier and Representation Learning) [29] and Memory Based Parameter Adaptation [35]. A special mention here goes to Gradient Episodic Memory for Continual Learning [20], as it moves a step forward towards the online setting: it assumes that the learner receives examples one by one but simplifies the scenario to locally i.i.d.…”
Section: Related Work (mentioning)
confidence: 99%
“…Their model added new neural units to the autoencoder to facilitate the addition of new MNIST digits. Rebuffi et al. (2017) proposed the iCaRL approach, which stores example data points that are used along with new data to dynamically adapt the weights of a feature extractor. By combining new and old data, they prevent catastrophic forgetting, but at the expense of a higher memory footprint.…”
Section: Related Work (mentioning)
confidence: 99%
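To make the exemplar idea in this last statement concrete, here is a hedged NumPy sketch of two ingredients the iCaRL paper describes: herding-style exemplar selection and nearest-mean-of-exemplars classification. The function names, and the assumption that rows of `features` are (ideally L2-normalized) embeddings from some feature extractor, are mine; this is a loose illustration, not the paper's implementation.

```python
import numpy as np

def select_exemplars(features, m):
    """Herding-style greedy selection: choose m rows of `features`
    (an (n, d) array of one class's embeddings, m <= n) whose
    running mean stays closest to the full class mean."""
    mu = features.mean(axis=0)
    chosen, running_sum = [], np.zeros_like(mu)
    for k in range(1, m + 1):
        # For every candidate, the mean we would get by adding it next.
        candidate_means = (running_sum + features) / k
        dists = np.linalg.norm(mu - candidate_means, axis=1)
        dists[chosen] = np.inf  # never pick the same exemplar twice
        i = int(np.argmin(dists))
        chosen.append(i)
        running_sum += features[i]
    return chosen  # indices of the selected exemplars

def nearest_mean_predict(x_feat, exemplar_feats_by_class):
    """Nearest-mean-of-exemplars rule: assign the class whose stored
    exemplars have the closest mean embedding."""
    means = {c: f.mean(axis=0) for c, f in exemplar_feats_by_class.items()}
    return min(means, key=lambda c: np.linalg.norm(x_feat - means[c]))
```

The trade-off quoted above is visible here: prediction depends on keeping real exemplars per class in memory, which is what drives the higher memory footprint.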