2021
DOI: 10.48550/arxiv.2109.04898
Preprint
LibFewShot: A Comprehensive Library for Few-shot Learning

Abstract: Few-shot learning, especially few-shot image classification, has received increasing attention and witnessed significant advances in recent years. Some recent studies implicitly show that many generic techniques or "tricks", such as data augmentation, pre-training, knowledge distillation, and self-supervision, may greatly boost the performance of a few-shot learning method. Moreover, different works may employ different software platforms, different training schedules, different backbone architectures and even…

Cited by 4 publications (4 citation statements)
References 26 publications
“…The current FSL mainly focuses on three approaches (Li et al 2021): (a) fine-tuning based methods, (b) meta-learning based methods, and (c) metric-learning based methods. Fine-tuning based methods (Chen et al 2019; Liu et al 2020a; Rajasegaran et al 2020; Dhillon et al 2019; Tian et al 2020; Yang, Liu, and Xu 2021) follow the standard transfer learning procedure (Weiss, Khoshgoftaar, and Wang 2016): pre-training on the base classes first, then fine-tuning on the novel classes.…”
Section: Related Work
confidence: 99%
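The two-stage recipe described above (pre-train on base classes, then fit a fresh head on a few novel-class shots) can be sketched with a toy two-layer network in numpy. All names, dimensions, and the synthetic data are illustrative assumptions, not taken from the paper or from LibFewShot's API:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train(X, y, B, W, lr=0.1, steps=300, freeze_backbone=False):
    """One hidden-layer net: backbone B -> ReLU -> head W, softmax cross-entropy."""
    for _ in range(steps):
        H = np.maximum(X @ B, 0.0)          # backbone features
        p = softmax(H @ W)
        p[np.arange(len(y)), y] -= 1.0      # gradient of CE w.r.t. logits
        gW = H.T @ p / len(y)
        if not freeze_backbone:
            gH = (p @ W.T) * (H > 0)        # backprop through ReLU
            B -= lr * X.T @ gH / len(y)
        W -= lr * gW
    return B, W

d, h = 16, 32
B = rng.normal(size=(d, h)) * 0.3

# Stage 1: pre-train backbone + head on plentiful base-class data (4 base classes).
Xb = rng.normal(size=(500, d))
yb = (Xb[:, 0] > 0).astype(int) + 2 * (Xb[:, 1] > 0)
B, _ = train(Xb, yb, B, np.zeros((h, 4)))

# Stage 2: keep the pre-trained backbone frozen, fine-tune a new head
# on a handful of novel-class shots (2 novel classes).
Xn = rng.normal(size=(10, d))
yn = (Xn[:, 0] > 0).astype(int)
_, Wn = train(Xn, yn, B, np.zeros((h, 2)), freeze_backbone=True)

acc = (np.argmax(np.maximum(Xn @ B, 0.0) @ Wn, axis=1) == yn).mean()
```

In practice the backbone is a deep CNN and stage 2 may also lightly update it; freezing it here just makes the "re-use pre-trained features" step explicit.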
“…CUB200 contains 11,788 images of birds from 200 species and is widely used for fine-grained classification. Following previous work [25], we split the categories into 130, 20, and 50 for training, validation, and testing, respectively.…”
Section: Datasets
confidence: 99%
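The 130/20/50 split above is a class-level split: the three sets contain disjoint species, so no test category is seen during training. A minimal sketch of such a split (the actual class assignment follows [25]; the random permutation here is only illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)

# Partition CUB200's 200 species into disjoint train/val/test class sets.
classes = rng.permutation(200)
train_cls = classes[:130]
val_cls = classes[130:150]
test_cls = classes[150:]
```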
“…Label smoothing is considered an effective general technique for improving the performance of few-shot learning methods [27]. Therefore, label smoothing was added to the architecture.…”
Section: MMEL Architecture
confidence: 99%
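Label smoothing replaces the one-hot training target with a mixture of the one-hot vector and a uniform distribution, discouraging over-confident predictions. A minimal numpy sketch of the standard formulation (the function name and `eps` value are illustrative, not from the cited work):

```python
import numpy as np

def smooth_labels(y, n_classes, eps=0.1):
    """(1 - eps) on the true class plus eps/K spread uniformly over all K classes."""
    onehot = np.eye(n_classes)[y]
    return onehot * (1.0 - eps) + eps / n_classes

# Two samples with true classes 0 and 2, smoothed over 3 classes.
targets = smooth_labels(np.array([0, 2]), n_classes=3, eps=0.1)
```

Each row still sums to 1; the true class receives 1 - eps + eps/K and every other class eps/K.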