2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr52688.2022.00884

Forward Compatible Few-Shot Class-Incremental Learning

Cited by 133 publications (38 citation statements)
References 33 publications

“…Träuble et al. (2021) proposed an efficient probabilistic approach to locate data instances whose old predictions could be incorrect and to update them with predictions from the new model. Zhou et al. (2022) looked into forward compatibility, where new classes can be incorporated easily without negatively impacting existing prediction behavior. More recently, Schumann et al. (2023) inspected classification-model regression during training-data updates and mitigated the problem by interpolating between the weights of the old and new models.…”
Section: Discussion
confidence: 99%
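
The weight-interpolation idea attributed to Schumann et al. (2023) above can be sketched in a few lines of PyTorch. This is a generic illustration under the assumption of two models with identical architectures; the function name and the default alpha are hypothetical, and the cited method's exact scheme may differ.

import copy
import torch

def interpolate_weights(old_model, new_model, alpha=0.5):
    # Hypothetical sketch: theta = (1 - alpha) * theta_old + alpha * theta_new.
    # alpha = 0 keeps the old model's behavior; alpha = 1 keeps the new one's.
    merged = copy.deepcopy(new_model)
    old_state = old_model.state_dict()
    merged_state = merged.state_dict()
    for name, new_param in new_model.state_dict().items():
        if torch.is_floating_point(new_param):  # skip integer buffers, e.g. BN counters
            merged_state[name] = (1.0 - alpha) * old_state[name] + alpha * new_param
    merged.load_state_dict(merged_state)
    return merged
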
“…Great efforts have been devoted to two directions: identifying and preserving the significant parameters of the original model [32], and memorizing the knowledge of the old model through strategies such as knowledge distillation [31]. Recently, some works have focused on generalizing CIL to a limited-data regime, leading to a new practical scenario, i.e., Few-Shot CIL (FSCIL) [9], [10], [11], [12], [13], [14], [15], [16]. Existing methods for FSCIL mainly employ two strategies.…”
Section: B. Class-Incremental Learning
confidence: 99%
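
The knowledge-distillation strategy named in this excerpt [31] is commonly realized as a temperature-softened KL term between the old model's logits and the new model's logits. A minimal, generic PyTorch sketch (the temperature T and its default value are assumptions, not taken from any cited paper):

import torch.nn.functional as F

def distillation_loss(new_logits, old_logits, T=2.0):
    # Soften both distributions with temperature T; penalizing their KL
    # divergence pushes the new model to preserve the old model's outputs.
    log_p_new = F.log_softmax(new_logits / T, dim=1)
    p_old = F.softmax(old_logits / T, dim=1)
    # The T**2 factor keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_new, p_old, reduction="batchmean") * (T ** 2)
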
“…Based on a model well trained on a large-scale base dataset, FSCIL [9], [10], [11], [12], [13], [14], [15], [16] aims to incrementally learn new classes from limited labeled data without forgetting previously seen categories. This emerging research topic faces the following challenges:…”
Section: Introduction
confidence: 99%
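
One common FSCIL recipe consistent with this description freezes the backbone after base training and represents each novel class by the mean embedding (prototype) of its few labeled examples, so prior classes are untouched when new ones arrive. The sketch below is a generic illustration, not the procedure of any specific paper cited here; all names are hypothetical.

import torch
import torch.nn.functional as F

@torch.no_grad()
def extend_prototypes(encoder, support_x, support_y, prototypes):
    # Embed the few-shot support samples with the frozen backbone.
    feats = F.normalize(encoder(support_x), dim=1)
    # One prototype per novel class: the re-normalized mean embedding.
    new_protos = [F.normalize(feats[support_y == c].mean(dim=0), dim=0)
                  for c in support_y.unique(sorted=True)]
    # Existing prototypes are left as-is, so old classes are not forgotten.
    return torch.cat([prototypes, torch.stack(new_protos)], dim=0)

@torch.no_grad()
def classify(encoder, x, prototypes):
    # Nearest-prototype prediction by cosine similarity.
    feats = F.normalize(encoder(x), dim=1)
    return (feats @ prototypes.T).argmax(dim=1)
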
“…We also introduce prompt regularization to improve performance and prevent forgetting. Our experimental results demonstrate that CPE-CLIP significantly improves FSCIL performance compared to state-of-the-art proposals while also drastically reducing the number of learnable parameters and the training cost. Recent research has focused on solving these problems through various approaches, such as meta-learning [57, 34], regularization techniques [30], or knowledge distillation [38, 6, 62]. These methods have shown promising results in achieving incremental learning over time with a limited amount of data available.…”
confidence: 99%
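
The excerpt does not define CPE-CLIP's prompt regularization. One common form, shown purely as a hypothetical sketch, anchors the learnable prompt embeddings to their initial values so that later incremental sessions cannot drift arbitrarily far from earlier behavior:

import torch

def prompt_drift_penalty(prompts, init_prompts, weight=0.1):
    # Hypothetical L2 anchor: penalize learnable prompt tokens for moving
    # away from their initialization; 'weight' is an assumed hyperparameter.
    return weight * (prompts - init_prompts).pow(2).sum()
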