2022
DOI: 10.48550/arxiv.2207.08224
Preprint
Learning with Recoverable Forgetting

Abstract: Life-long learning aims at learning a sequence of tasks without forgetting the previously acquired knowledge. However, the involved training data may not be life-long legitimate due to privacy or copyright reasons. In practical scenarios, for instance, the model owner may wish to enable or disable the knowledge of specific tasks or specific samples from time to time. Such flexible control over knowledge transfer, unfortunately, has been largely overlooked in previous incremental or decremental learning methods…

Cited by 1 publication (2 citation statements) | References 28 publications
“…Recent work has studied machine unlearning for deep neural networks in various setups, which can be categorized into two approaches: class-wise and instance-wise unlearning. Class-wise unlearning forgets all data points belonging to a certain class (e.g., all images of dogs in CIFAR-10) while retaining performance on the remaining classes (Tarun et al 2021; Chundawat et al 2022; Ye et al 2022; Yoon et al 2022; Graves, Nagisetty, and Ganesh 2021). In contrast, instance-wise unlearning deletes information from individual data points with mixed classes (Golatkar, Achille, and Soatto 2020; Kim and Woo 2022; Mehta et al 2022).…”
Section: Related Work
confidence: 99%
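The distinction quoted above comes down to how the "forget set" is carved out of the training data. A minimal sketch of the two setups follows; the dataset format (a list of `(example, label)` pairs) and the function names are illustrative assumptions, not APIs from the cited papers.

```python
# Sketch of forget/retain partitioning under the two unlearning setups.
# Class-wise: forget every point whose label falls in a set of classes.
# Instance-wise: forget individually chosen points, regardless of class.

def classwise_split(dataset, forget_classes):
    """Class-wise unlearning: forget all points of the given classes."""
    forget = [(x, y) for x, y in dataset if y in forget_classes]
    retain = [(x, y) for x, y in dataset if y not in forget_classes]
    return forget, retain

def instancewise_split(dataset, forget_indices):
    """Instance-wise unlearning: forget specific points by index."""
    forget = [ex for i, ex in enumerate(dataset) if i in forget_indices]
    retain = [ex for i, ex in enumerate(dataset) if i not in forget_indices]
    return forget, retain

data = [("img0", "dog"), ("img1", "cat"), ("img2", "dog"), ("img3", "bird")]

f1, r1 = classwise_split(data, {"dog"})    # forgets both dog images
f2, r2 = instancewise_split(data, {1, 2})  # forgets img1 (a cat) and img2 (a dog)
```

Note that the instance-wise forget set here mixes classes, which is exactly what makes that setting harder: the model must lose specific points while keeping performance on other points of the same classes.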
“…While several machine unlearning approaches have shown promising results deleting data from traditional machine learning algorithms (Mahadevan and Mathioudakis 2021; Ginart et al 2019; Brophy and Lowd 2021) as well as DNN-based classifiers (Tarun et al 2021; Chundawat et al 2022; Ye et al 2022; Yoon et al 2022; Golatkar, Achille, and Soatto 2020; Kim and Woo 2022; Mehta et al 2022), existing work is built upon assumptions far too restrictive compared to real-life scenarios. First off, many approaches assume a class-wise unlearning setup, where the task is to delete information from all data points that belong to a particular class or set of classes.…”
Section: Introduction
confidence: 99%