2022
DOI: 10.48550/arxiv.2201.10836
Preprint
PARS: Pseudo-Label Aware Robust Sample Selection for Learning with Noisy Labels

Abstract: Acquiring accurate labels on large-scale datasets is both time-consuming and expensive. To reduce the dependence of deep learning models on clean labeled data, several recent research efforts have focused on learning with noisy labels. These methods typically fall into three design categories for learning a noise-robust model: sample selection approaches, noise-robust loss functions, or label correction methods. In this paper, we propose PARS: Pseudo-Label Aware Robust Sample Selection, a hybrid appro…
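To make the "sample selection" family concrete, here is a minimal sketch of the generic small-loss selection heuristic that such methods build on. This is an illustration of the category, not the PARS algorithm itself; the function name and the assumption that the noise rate is known are hypothetical choices for the example.

```python
import numpy as np

def select_small_loss(losses, noise_rate):
    """Keep the fraction of samples with the smallest per-sample loss.

    Generic 'small-loss' selection heuristic (illustrative, not PARS):
    early in training, examples the model fits easily tend to carry
    correct labels, while noisy examples incur larger losses.
    """
    n_keep = int(len(losses) * (1.0 - noise_rate))
    order = np.argsort(losses)   # ascending: smallest-loss samples first
    return order[:n_keep]        # indices of presumed-clean samples

losses = np.array([0.1, 2.3, 0.2, 1.9, 0.05])
print(select_small_loss(losses, noise_rate=0.4))  # → [4 0 2]
```

Selection-based methods then train (or reweight) only on the returned subset, typically re-estimating it each epoch as the loss distribution evolves.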

Cited by 2 publications (2 citation statements)
References 21 publications
“…A major issue for ST approaches is confirmation bias, where the student model would accumulate errors from the teacher model when learning with inaccurate pseudo-labels (e.g. Goel et al, 2022; …).…”
Section: Related Work
confidence: 99%
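The confirmation-bias issue quoted above is commonly mitigated by only training the student on pseudo-labels the teacher is confident about. The sketch below shows that confidence-threshold idea in isolation; the function name and threshold value are hypothetical, and this is not a description of how the cited works implement it.

```python
import numpy as np

def pseudo_label(teacher_probs, threshold=0.9):
    """Return hard pseudo-labels only where the teacher is confident.

    Illustrative mitigation for confirmation bias: low-confidence
    teacher predictions are masked out, so the student is less likely
    to train on (and so reinforce) the teacher's mistakes.
    """
    conf = teacher_probs.max(axis=1)      # teacher confidence per sample
    labels = teacher_probs.argmax(axis=1) # hard pseudo-label per sample
    mask = conf >= threshold              # keep only confident predictions
    return labels[mask], np.flatnonzero(mask)

probs = np.array([[0.95, 0.05],
                  [0.55, 0.45],   # uncertain: dropped
                  [0.10, 0.90]])
labels, kept = pseudo_label(probs)
print(labels, kept)  # → [0 1] [0 2]
```

The masked-out samples can still contribute through unsupervised objectives; only the hard pseudo-label loss is withheld.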
“…Employing object detection methods in SSOD poses several potential challenges that must be carefully dealt with to obtain reasonable performance. These factors include overfitting of the labeled data [39], pseudo-label noise [11], bias induced through label imbalance [18, 33], and poor detection performance on small objects [58]. Recently, DETR-based [2, 17, 19, 26, 40, 57, 61] SSOD methods [48, 58] remove the need for traditional components like NMS.…”
Section: Introduction
confidence: 99%