2021
DOI: 10.1007/s11432-020-3156-7

Task-wise attention guided part complementary learning for few-shot image classification

Abstract: A general framework to tackle the problem of few-shot learning is meta-learning, which aims to train a well-generalized meta-learner (or backbone network) to learn a base-learner for each future task with small training data. Although a lot of work has produced relatively good results, there are still some challenges for few-shot image classification. First, meta-learning is a learning problem over a collection of tasks and the meta-learner is usually shared among all tasks. To achieve image classification of …
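The abstract frames few-shot classification as meta-learning over a collection of tasks (episodes). As a rough illustration of that general setup only, and not of the paper's task-wise attention or part complementary method, a minimal episodic training step might look like the sketch below; the prototype-style base-learner, the 5-way sampling, and all names (`backbone`, `episode_sampler`) are assumptions made for illustration.

```python
# Minimal episodic few-shot sketch (assumption: a ProtoNet-style base-learner,
# not the method proposed in the paper above).
import torch
import torch.nn.functional as F

def episode_loss(backbone, support_x, support_y, query_x, query_y, n_way):
    """One meta-training episode: build class prototypes from the support set,
    then classify the query set by negative distance to each prototype."""
    support_feat = backbone(support_x)            # (n_way * k_shot, d)
    query_feat = backbone(query_x)                # (n_query, d)

    # Class prototype = mean support embedding per class.
    prototypes = torch.stack(
        [support_feat[support_y == c].mean(dim=0) for c in range(n_way)]
    )                                             # (n_way, d)

    # Logits: negative squared Euclidean distance to each prototype.
    logits = -torch.cdist(query_feat, prototypes) ** 2   # (n_query, n_way)
    return F.cross_entropy(logits, query_y)

# Hypothetical meta-training loop:
# for support_x, support_y, query_x, query_y in episode_sampler:
#     loss = episode_loss(backbone, support_x, support_y, query_x, query_y, n_way=5)
#     optimizer.zero_grad(); loss.backward(); optimizer.step()
```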

Cited by 54 publications (10 citation statements) · References 37 publications
“…Classification is one of the most basic tasks in machine learning, and it has been applied to many fields such as natural image classification [11], [34], handwriting recognition [35], medical image classification [36], etc. In the early days, simple methods such as logistic regression [37] and the naive Bayes classifier [38] were proposed to solve the linear classification problem.…”
Section: Related Work
Mentioning confidence: 99%
“…Feature-based attention has proved its effectiveness in many computer vision tasks as a perception-adapted mechanism [49]. For instance, the Squeeze-and-Excitation network (SENet) [50] proposed by Hu et al. adaptively recalibrates channel relationships by global information embedding and fully connected (FC) layers.…”
Section: Attention Mechanism
Mentioning confidence: 99%
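The statement above summarizes the Squeeze-and-Excitation idea: squeeze spatial information into a per-channel descriptor via global average pooling, then excite (reweight) the channels through a small FC bottleneck. A minimal sketch of such a channel-attention block is shown below; the reduction ratio and layer sizes are illustrative assumptions, not values taken from the cited works.

```python
# Minimal Squeeze-and-Excitation style channel attention block (illustrative sketch).
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # squeeze: global spatial average
        self.fc = nn.Sequential(                     # excitation: FC bottleneck
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.pool(x).view(b, c)                  # (b, c) channel descriptor
        w = self.fc(w).view(b, c, 1, 1)              # per-channel weights in [0, 1]
        return x * w                                 # recalibrate channel responses

# Usage: out = SEBlock(channels=256)(feature_map)    # feature_map: (b, 256, h, w)
```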
“…humans can easily recognize new concepts or patterns from a handful of examples, which greatly stimulates the research interest of the community [39,52,53]. Thus, few-shot learning (FSL) is proposed to address this problem by building a network that can be generalized to unseen domains with scarce annotated samples available [7,42,54,57].…”
Section: Introduction
Mentioning confidence: 99%