2021
DOI: 10.48550/arxiv.2108.11569
Preprint

Robust Long-Tailed Learning under Label Noise

Abstract: Long-tailed learning has attracted much attention recently, with the goal of improving generalisation for tail classes. Most existing works use supervised learning without considering the prevailing noise in the training dataset. To move long-tailed learning towards more realistic scenarios, this work investigates the label noise problem under long-tailed label distribution. We first observe the negative impact of noisy labels on the performance of existing methods, revealing the intrinsic challenges of this p…

Cited by 5 publications (7 citation statements)
References 24 publications
“…First, for mislabeling, we inject class-dependent label noise. Given a noise rate τ, the presence of the i-th class is mislabeled as that of the j-th class with a probability of ρ_{i→j}; we follow the protocol used for a long-tail noisy label setup (Wei et al. 2021). For the two different classes i and j,…”
Section: Discussion (mentioning)
confidence: 99%
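The quoted protocol (flip class i to class j with probability ρ_{i→j} under an overall noise rate τ) can be simulated with a row-stochastic transition matrix. The following is a minimal Python sketch of class-dependent noise injection; the uniform off-diagonal split, the toy labels, and the seed are illustrative assumptions, not the exact setup of Wei et al. 2021 or of the citing paper.

```python
import numpy as np

def inject_class_dependent_noise(labels, transition, seed=0):
    """Flip label i to label j with probability transition[i, j].

    `transition` must be row-stochastic; the diagonal entry of row i is the
    probability that class i keeps its clean label. Generic sketch only.
    """
    rng = np.random.default_rng(seed)
    num_classes = transition.shape[0]
    return np.array([rng.choice(num_classes, p=transition[y]) for y in labels])

# Toy setup: 3 classes, noise rate tau = 0.2 spread uniformly over the other classes.
num_classes, tau = 3, 0.2
T = np.full((num_classes, num_classes), tau / (num_classes - 1))
np.fill_diagonal(T, 1.0 - tau)

clean_labels = np.array([0, 0, 1, 2, 2, 2])
noisy_labels = inject_class_dependent_noise(clean_labels, T)
```

In a long-tailed setting the same matrix can be made asymmetric, e.g. putting more off-diagonal mass on head classes so that tail samples are preferentially mislabeled as head classes.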
“…A considerable effort has also been made to use semi-supervised learning, as in DivideMix (Li, Socher, and Hoi 2020) and PES (Bai et al. 2021). In addition, a few studies have addressed class imbalance in the noisy single-label setup (Wei et al. 2021; Ding et al. 2022), but they cannot be immediately applied to the multi-label setup owing to their inability to handle the various types of label noise caused by the nature of having both clean and incorrect labels in one instance.…”
Section: Related Work (mentioning)
confidence: 99%
“…Self-training aims to learn well-performing models from a small number of labeled samples and massive unlabeled samples [143], [144], [145]. To be specific, it first uses labeled samples to train a supervised model, which is then applied to generate pseudo labels for unlabeled data.
Section: Transfer Learning (mentioning)
confidence: 99%
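The self-training loop described in that statement (fit on labeled data, pseudo-label the confident unlabeled samples, refit) can be sketched as below; the LogisticRegression base model, the 0.9 confidence threshold, and the number of rounds are illustrative assumptions, not values from the cited survey.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def self_train(X_lab, y_lab, X_unlab, threshold=0.9, rounds=3):
    """Minimal self-training: train on labeled data, pseudo-label the
    confident unlabeled samples, add them to the training set, and refit."""
    X, y, pool = X_lab.copy(), y_lab.copy(), X_unlab.copy()
    model = LogisticRegression(max_iter=1000)
    for _ in range(rounds):
        model.fit(X, y)
        if len(pool) == 0:
            break
        proba = model.predict_proba(pool)
        conf = proba.max(axis=1)
        pseudo = model.classes_[proba.argmax(axis=1)]  # map column index back to class label
        keep = conf >= threshold
        if not keep.any():
            break
        X = np.vstack([X, pool[keep]])
        y = np.concatenate([y, pseudo[keep]])
        pool = pool[~keep]
    return model
```

Under label noise, the confidence threshold does double duty: it controls both how many pseudo labels are admitted and how many of them are likely to be wrong.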
“…Robust long-tailed learning. Real-world long-tailed samples may also suffer from image noise [95], [171] or label noise [140], [145]. Most long-tailed methods, however, assume all images and labels are clean, leading to poor model robustness in practical applications.…”
Section: New Task Settings (mentioning)
confidence: 99%
“…There is a large body of recent work on learning with noisy labels, which includes but is not limited to estimating the noise transition matrix [9,20,53,54], reweighting examples [38,44,45,47], selecting confident examples [4,25,33,56], designing robust loss functions [10,12,49,64], introducing regularization [5,23,61], generating pseudo labels [17,34,46,63,66], etc. In addition, some advanced state-of-the-art methods combine several techniques, e.g., DivideMix [30] and ELR+ [37].…”
Section: Learning With Noisy Labels (mentioning)
confidence: 99%
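Of the technique families listed, "designing robust loss functions" can be made concrete with the generalized cross entropy loss of Zhang and Sabuncu (2018), sketched below as one well-known instance; it is not claimed to be any particular bracketed reference, and q = 0.7 is the value suggested in that paper, used here only for illustration.

```python
import torch
import torch.nn.functional as F

def generalized_cross_entropy(logits, targets, q=0.7):
    """Generalized cross entropy: L_q = (1 - p_y^q) / q.

    Interpolates between standard cross entropy (q -> 0) and MAE (q = 1),
    which dampens the gradient contribution of likely-mislabeled samples.
    """
    probs = F.softmax(logits, dim=1)
    p_y = probs.gather(1, targets.unsqueeze(1)).squeeze(1)  # probability of the (possibly noisy) target class
    return ((1.0 - p_y.pow(q)) / q).mean()

# Usage: drop-in replacement for F.cross_entropy in a training loop.
logits = torch.randn(8, 10)
targets = torch.randint(0, 10, (8,))
loss = generalized_cross_entropy(logits, targets)
```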