2021
DOI: 10.1109/access.2021.3074525
Augmenting Few-Shot Learning With Supervised Contrastive Learning

Abstract: Few-shot learning deals with a small amount of data, which leads to insufficient performance with the conventional cross-entropy loss. We propose a pretraining approach for few-shot learning scenarios. That is, considering that feature extractor quality is a critical factor in few-shot learning, we augment the feature extractor using a contrastive learning technique. It is reported that supervised contrastive learning applied to base-class training in a transductive few-shot training pipeline leads to improved resu…
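The supervised contrastive loss referenced in the abstract (Khosla et al., 2020) pulls together embeddings of same-class samples and pushes apart different-class ones. A minimal NumPy sketch of the batch loss, assuming L2-normalized embeddings and a temperature hyperparameter (the function name and defaults are illustrative, not from the paper):

```python
import numpy as np

def supcon_loss(features, labels, temperature=0.1):
    """Supervised contrastive loss over a batch.

    features: (N, D) array of embeddings (normalized inside)
    labels:   (N,) integer class labels
    """
    features = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = features @ features.T / temperature      # (N, N) scaled similarities
    N = len(labels)
    logits_mask = ~np.eye(N, dtype=bool)           # exclude self-similarity
    sim = sim - sim.max(axis=1, keepdims=True)     # numerical stability
    exp_sim = np.exp(sim) * logits_mask
    log_prob = sim - np.log(exp_sim.sum(axis=1, keepdims=True))
    # positives: same label as the anchor, excluding the anchor itself
    pos_mask = (labels[:, None] == labels[None, :]) & logits_mask
    pos_counts = pos_mask.sum(axis=1)
    valid = pos_counts > 0                         # anchors with >= 1 positive
    mean_log_prob_pos = (pos_mask * log_prob).sum(axis=1)[valid] / pos_counts[valid]
    return -mean_log_prob_pos.mean()
```

In the pretraining setting described here, this loss would be applied to base-class batches so the feature extractor clusters classes tightly before the few-shot stage.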

Cited by 17 publications (5 citation statements)
References 26 publications
“…After determining the mapping function, it may be used to analyze new inputs for which the output value is not known [12]. If there are enough training examples and a suitable learning method is used, the algorithm will be able to generalize successfully, i.e., give correct results for inputs that are similar but not identical to those in the dataset.…”
Section: The Methods Of Supervised Learning (citation type: mentioning; confidence: 99%)
“…Ma [17] fuses advanced feature detail enhancement and multi-scale features for contextual semantic edge detection. Ju [18] proposes an attention mechanism for multi-scale target detection that adaptively learns the importance and relevance of features at different scales to cope with scale variations. Lee [19] enhances feature extraction networks with contrastive learning.…”
Section: Related Work (citation type: mentioning; confidence: 99%)
“…Meta-learning improves deep learning models by enabling them to learn from a variety of tasks, enhancing their ability to quickly adapt to new and unseen tasks with minimal additional data. In contrast, conventional deep learning has many limitations, including the inability to learn new tasks as quickly as humans do when drawing on prior experience, and many classifiers require massive amounts of training data [15]. Furthermore, deep learning algorithm performance depends heavily on labeled data with predefined attributes, which sometimes restricts generalization.…”
Section: Meta-learning (citation type: mentioning; confidence: 99%)