2022
DOI: 10.48550/arxiv.2203.06953
Preprint

Forward Compatible Few-Shot Class-Incremental Learning

Cited by 4 publications (11 citation statements)
References 41 publications
“…We compare our proposed method with 7 other methods: iCaRL [30], TOPIC [38], IDLVQ-C [4], F2M [34], CEC [45], C-FSCIL [14] and FACT [49]. Overall, our proposed M-FSCIL outperforms all state-of-the-art methods on these three benchmark datasets. Because our model's backbone is pre-trained on various image-text pairs, it performs better on object classification datasets such as miniImageNet.…”
Section: Experimental Results Analysis
confidence: 92%
“…Few-shot class-incremental learning was recently proposed to tackle few-shot inputs in the class-incremental setting [68], [69].…”
Section: Few-shot Class-incremental Learning
confidence: 99%
“…The former group seeks to rehearse former knowledge when learning new classes, and the latter group saves extra model components to assist incremental learning. There are other methods that do not fall into these two groups [24,29,23,46,61,66], and we refer the readers to [10,36,62] for a holistic review. Exemplar-Based Methods: Exemplars are representative instances from former classes [51], and CIL models can selectively save a relatively small amount of exemplars for rehearsal during updating [22].…”
Section: Related Work
confidence: 99%
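The exemplar-rehearsal idea in the last statement can be sketched concretely. Below is a minimal, illustrative implementation of "herding"-style exemplar selection, the classic strategy used by iCaRL-like methods: keep the m instances whose running mean best approximates the class-mean feature. The function name `select_exemplars` and the toy data are assumptions for illustration, not code from any cited paper.

```python
import numpy as np

def select_exemplars(features: np.ndarray, m: int) -> list:
    """Pick m row indices whose cumulative mean best tracks the class mean."""
    class_mean = features.mean(axis=0)
    chosen = []
    running_sum = np.zeros_like(class_mean)
    for k in range(1, m + 1):
        # Mean that would result from adding each remaining candidate.
        candidate_means = (running_sum + features) / k
        dists = np.linalg.norm(candidate_means - class_mean, axis=1)
        dists[chosen] = np.inf  # never reuse an already-chosen exemplar
        best = int(np.argmin(dists))
        chosen.append(best)
        running_sum += features[best]
    return chosen

# Toy usage: 100 feature vectors per class, keep 5 exemplars for rehearsal.
rng = np.random.default_rng(0)
feats = rng.normal(size=(100, 16))
exemplar_ids = select_exemplars(feats, m=5)
print(exemplar_ids)  # 5 distinct row indices
```

During an incremental update, the saved exemplars from former classes would be mixed into the new-class training batches so the model rehearses old knowledge while learning new classes.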