2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition
DOI: 10.1109/cvpr.2018.00131
Learning to Compare: Relation Network for Few-Shot Learning

Abstract: We present a conceptually simple, flexible, and general framework for few-shot learning, where a classifier must learn to recognise new classes given only a few examples from each. Our method, called the Relation Network (RN), is trained end-to-end from scratch. During meta-learning, it learns to learn a deep distance metric to compare a small number of images within episodes, each of which is designed to simulate the few-shot setting. Once trained, a RN is able to classify images of new classes by computing rel…
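The abstract describes comparing a query image to the few support examples via a learned relation score rather than a fixed metric. The following is a minimal toy sketch of that idea, not the paper's architecture: `embed` and `relation_score` are illustrative stand-ins (a single linear map instead of the convolutional embedding and relation modules used in the paper), with toy dimensions.

```python
import numpy as np

rng = np.random.default_rng(0)

def embed(x, W):
    """Toy embedding module: one linear map with ReLU (stand-in for a CNN)."""
    return np.maximum(x @ W, 0.0)

def relation_score(q_feat, s_feat, V):
    """Toy relation module: score the concatenated (query, support) features."""
    pair = np.concatenate([q_feat, s_feat])
    return 1.0 / (1.0 + np.exp(-(pair @ V)))  # sigmoid -> score in (0, 1)

# 5-way 1-shot episode with 8-dim inputs and 4-dim embeddings (toy sizes).
W = rng.normal(size=(8, 4))        # embedding weights
V = rng.normal(size=(8,))          # relation weights over concatenated features
support = rng.normal(size=(5, 8))  # one support example per class
query = rng.normal(size=(8,))      # one query image

q_feat = embed(query, W)
scores = [relation_score(q_feat, embed(s, W), V) for s in support]
pred = int(np.argmax(scores))      # predicted class = highest relation score
```

In the actual method both modules are deep networks trained jointly end-to-end, so the comparison function itself is learned rather than hand-designed.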



Cited by 3,744 publications (2,757 citation statements)
References 26 publications
“…Few-shot classification predicts image class labels with access to few training examples. Prior work can be broadly divided into three groups: transfer learning of models trained on classes similar to the target classes [28,22,29,30], meta-learning approaches that learn how to effectively learn new classes from small datasets [8,18], and generative approaches aimed at data-augmentation [31,25].…”
Section: Related Work
confidence: 99%
“…Current meta-learning approaches that find a model-weight initialization are typically evaluated by applying them to few-shot classification problems, because it is generally easier to generate the necessary number of tasks required for meta-learning when dealing with few-shot tasks. Few-shot learning [6,21,4] strives to achieve the highest possible classification performance when faced with a new task that comprises only a handful of samples per class. This can be achieved by learning an initialization that converges fast, even when only few instances are given, but also through the application of other meta-learning approaches.…”
Section: Related Work
confidence: 99%
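The excerpt above describes meta-learning an initialization that adapts to a new task in a few gradient steps. A deliberately tiny sketch of that idea, in the style of a first-order MAML update (my illustration, not from the cited works): each task is a one-parameter regression with squared loss (w - a)^2, the inner loop takes one gradient step from the shared initialization, and the outer loop moves the initialization using the gradient at the adapted weight.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad(w, a):
    # Squared task loss (w - a)^2 for task parameter a; gradient is 2*(w - a).
    return 2.0 * (w - a)

w0 = 5.0                                  # meta-learned initialization
inner_lr, outer_lr = 0.25, 0.1
for _ in range(500):
    a = rng.uniform(-1.0, 1.0)            # sample a task
    w = w0 - inner_lr * grad(w0, a)       # inner loop: one adaptation step
    w0 -= outer_lr * grad(w, a)           # outer loop: first-order meta-update
# Tasks are centred at 0, so the initialization drifts toward 0, the point
# from which a single inner step already fits any sampled task well.
```

Full MAML differentiates through the inner step; the first-order variant shown here drops that second-derivative term for simplicity.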
“…Another category of meta-learning approaches is referred to as Transfer Learning [20,14]. It describes the process of training a model on different auxiliary tasks and then using the learned model to actually fit to the target problem to improve performance.…”
Section: Related Work
confidence: 99%
“…1) Model: The few-shot learning-based classifier is trained within the framework of meta-learning [17]. The key idea is to learn the transferred knowledge among a large number of similar few-shot tasks, which can be further used for the new tasks [17], [45], [46]. Each few-shot task includes a support set and a query set.…”
Section: B. Food Item Recognition
confidence: 99%
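The excerpt above notes that each few-shot task consists of a support set and a query set. A minimal sketch of how such an episode is commonly sampled (an N-way K-shot task); the function and dataset names here are illustrative, not from any of the cited papers.

```python
import random

def sample_episode(dataset, n_way=5, k_shot=1, q_queries=2, seed=0):
    """Build one few-shot task: a support set (k_shot labelled examples per
    class) and a disjoint query set (q_queries held-out examples per class)."""
    rng = random.Random(seed)
    classes = rng.sample(sorted(dataset), n_way)       # pick N classes
    support, query = [], []
    for label, cls in enumerate(classes):
        items = rng.sample(dataset[cls], k_shot + q_queries)
        support += [(x, label) for x in items[:k_shot]]
        query += [(x, label) for x in items[k_shot:]]
    return support, query

# Toy dataset: class name -> list of example ids.
toy = {f"class_{i}": [f"img_{i}_{j}" for j in range(10)] for i in range(20)}
support, query = sample_episode(toy, n_way=5, k_shot=1, q_queries=2)
# support has 5*1 labelled examples; query has 5*2 held-out examples
```

During meta-training, many such episodes are drawn from the base classes so the model learns to classify queries from the support set alone; at test time the same procedure is applied to novel classes.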