2021
DOI: 10.22541/au.162464884.44336363/v1
Preprint

Remembering for the Right Reasons: Explanations Reduce Catastrophic Forgetting

Abstract: The goal of continual learning (CL) is to learn a sequence of tasks without suffering from the phenomenon of catastrophic forgetting. Previous work has shown that leveraging memory in the form of a replay buffer can reduce performance degradation on prior tasks. We hypothesize that forgetting can be further reduced when the model is encouraged to remember the evidence for previously made decisions. As a first step towards exploring this hypothesis, we propose a simple novel training paradigm, called Rememberin…
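The replay-buffer idea the abstract refers to can be illustrated with a minimal sketch: a fixed-size memory stores examples from earlier tasks, and each training step on the current task is interleaved with a few replayed examples. This is a generic illustration of rehearsal, not the paper's own implementation; the class and function names are hypothetical.

```python
import random

class ReplayBuffer:
    """Fixed-size memory of (input, label) examples from earlier tasks."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []

    def add(self, example):
        # Simplest policy: keep the first `capacity` examples seen.
        if len(self.data) < self.capacity:
            self.data.append(example)

    def sample(self, k):
        # Draw up to k stored examples uniformly at random.
        k = min(k, len(self.data))
        return random.sample(self.data, k)

def make_rehearsal_batch(current_batch, buffer, replay_size=2):
    """Interleave current-task examples with replayed ones, so the model
    is trained on an approximate joint distribution over tasks."""
    return list(current_batch) + buffer.sample(replay_size)
```

In a full training loop, the combined batch would be fed to the usual optimizer step; the only change rehearsal makes is the composition of the batch.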

Cited by 8 publications (3 citation statements)
References 23 publications
“…In this way, we could choose the smallest number of samples that contain the most influential information about each task. Several works in the literature propose methods for mitigating catastrophic forgetting in artificial neural networks that exploit explainable AI [32,33,34], and they manage to achieve comparable and, in some cases, higher performances than other state-of-the-art approaches.…”
Section: Discussion
confidence: 99%
“…The base method, Experience Replay (ER) (Riemer et al. 2018) interleaves the training of the current task with the memory samples to train the model on the approximate joint distribution of tasks. Several studies focus on the different aspects of rehearsal: memory sample selection (Isele and Cosgun 2018), sample retrieval from memory (Aljundi et al. 2019a), and what information to extract and replay (Ebrahimi et al. 2020; Bhat, Zonooz, and Arani 2022).…”
Section: Related Work
confidence: 99%
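One standard answer to the memory-sample-selection question mentioned above is reservoir sampling, which keeps a uniform random subset of the stream without knowing its length in advance (the "Reservoir" baseline referenced in the next citation). A minimal sketch, independent of any of the cited implementations:

```python
import random

def reservoir_update(memory, capacity, example, n_seen):
    """One reservoir-sampling step. After n items have streamed past,
    each of them is held in `memory` with probability capacity / n.
    `n_seen` is the number of examples observed before this one."""
    if len(memory) < capacity:
        memory.append(example)
    else:
        j = random.randint(0, n_seen)  # inclusive upper bound
        if j < capacity:
            memory[j] = example

# Stream 1000 examples through a 50-slot memory.
memory, seen = [], 0
for example in range(1000):
    reservoir_update(memory, 50, example, seen)
    seen += 1
```

Because each incoming example displaces a stored one with the right probability, the memory stays an unbiased sample of everything seen so far, which is what lets replay approximate the joint distribution of tasks.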
“…Memory-based methods address catastrophic forgetting by incorporating data from previous tasks into the training process for the current task [14,7]. These approaches can use raw samples [38,10]…”
[Figure: results on CIFAR-100 and CORe50; the dotted lines represent the Reservoir method.]
Section: Related Work
confidence: 99%