2021 IEEE/CVF International Conference on Computer Vision (ICCV) 2021
DOI: 10.1109/iccv48922.2021.00929
Boosting the Generalization Capability in Cross-Domain Few-shot Learning via Noise-enhanced Supervised Autoencoder

Cited by 52 publications (24 citation statements)
References 25 publications
“…Typically, CD-FSL with only source data is the most strict setting that demands model to recognize totally unseen target dataset without any target information. Flagship works including FWT [44], BSCD-FSL [18], LRP [40], ATA [48], wave-SAN [14], RDC [29], NSAE [32], and ConFT [10]. Though many well-designed techniques e.g.…”
Section: Related Work (mentioning; confidence: 99%)
“…Though many well-designed techniques e.g. readjusting the batch normalization [44], augmenting the difficult of meta tasks [48], spanning style distributions [14], and even fine-tuning models using few target images during the testing stage [10,18,29,32], the performances of them are still greatly limited due to the huge domain gap. By contrast, STARTUP [33] relaxes this strict setting and uses unlabeled target data for training.…”
Section: Related Work (mentioning; confidence: 99%)
“…Our approach also follows the pre-training paradigm, and we further expect the learned representations to be compact and cross-domain aligned to address the CDCS-FSL problem. CD-FSL [15,42,29,47,22,14,10] considers the domain shift problem between the base classes and the novel classes. Due to such domain gap, [4] show that meta-learning approaches fail to adapt to novel classes.…”
Section: Related Work (mentioning; confidence: 99%)
“…To make matters worse, most DL methods in the state of the art still require humongous amounts of data to be trained, which is not realistic in most of the medical application domains. Therefore, in recent years, the meta-learning and Few-Shot Learning (FSL) paradigms have emerged as a means to cope with the training data scarcity problem and furthermore, to make models more capable of generalization with less computing effort and incremental learning capabilities [28].…”
Section: Introduction (mentioning; confidence: 99%)