2020
DOI: 10.48550/arXiv.2011.06782
Preprint

A Nested Bi-level Optimization Framework for Robust Few Shot Learning

Abstract: Model-Agnostic Meta-Learning (MAML) is a popular gradient-based meta-learning framework that seeks an initialization minimizing the expected loss across all tasks during meta-training. However, it inherently assumes that every instance/task contributes equally to the meta-learner, so it fails to address domain differences between base and novel classes in few-shot learning. In this work, we propose a novel and robust meta-learning algorithm, called RW-MAML, which …
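To make the setup concrete, the following is a minimal, hypothetical PyTorch sketch of one weighted-MAML outer step on toy linear-regression tasks. The tasks, model, and the fixed task_weights below are illustrative assumptions only: vanilla MAML corresponds to uniform weights, whereas RW-MAML (per the abstract) would learn instance/task weights through a nested bi-level optimization, which this toy does not implement.

    # Hypothetical sketch: MAML outer step with fixed per-task weights.
    # Not the paper's RW-MAML bi-level update; weights here are hand-set.
    import torch

    torch.manual_seed(0)

    def make_task(slope, n=10):
        # Toy 1-D regression task: y = slope * x.
        x = torch.randn(n, 1)
        return x, slope * x

    def mse(w, x, y):
        return ((x @ w - y) ** 2).mean()

    def weighted_maml_step(w0, tasks, task_weights, inner_lr=0.1, outer_lr=0.01):
        meta_loss = 0.0
        for (x_s, y_s, x_q, y_q), v in zip(tasks, task_weights):
            # Inner loop: one gradient step on the support set,
            # keeping the graph so the outer gradient flows through it.
            g = torch.autograd.grad(mse(w0, x_s, y_s), w0, create_graph=True)[0]
            w_adapted = w0 - inner_lr * g
            # Outer objective: task-weighted query loss after adaptation.
            meta_loss = meta_loss + v * mse(w_adapted, x_q, y_q)
        g_outer = torch.autograd.grad(meta_loss, w0)[0]
        w_new = (w0 - outer_lr * g_outer).detach().requires_grad_(True)
        return w_new, meta_loss.item()

    # Two in-distribution tasks plus one deliberately out-of-distribution task.
    tasks = []
    for slope in (1.0, 1.2, -5.0):
        x_s, y_s = make_task(slope)
        x_q, y_q = make_task(slope)
        tasks.append((x_s, y_s, x_q, y_q))

    weights = torch.tensor([0.45, 0.45, 0.10])  # hypothetical fixed weights

    w0 = torch.zeros(1, 1, requires_grad=True)
    for step in range(100):
        w0, loss = weighted_maml_step(w0, tasks, weights)
    print("meta-loss after training:", loss)

Down-weighting the out-of-distribution task (the 0.10 entry) reduces its pull on the meta-initialization, which is the intuition behind re-weighting; the actual method would adjust such weights automatically in the outer level of the bi-level problem.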


Cited by 1 publication (2 citation statements)
References 12 publications
“…This allows the model to better filter out noisy or mislabeled data during training. Other works, such as RW-MAML (Killamsetty et al. 2020), AQ (Goldblum, Fowl, and Goldstein 2020), and DFSL tackle out-of-distribution tasks or adversarial attacks in few-shot learning.…”
Section: Related Work
confidence: 99%
“…Prior works on noisy few-shot learning (NFSL) attempt to address this issue using techniques such as feature aggregation (Liang et al. 2022), data augmentation (Mazumder, Singh, and Namboodiri 2021), and example re-weighting (Killamsetty et al. 2020). They have achieved moderate performance in noisy few-shot learning, but are fundamentally limited in two key aspects.…”
Section: Introduction
confidence: 99%