2019 Joint IEEE 9th International Conference on Development and Learning and Epigenetic Robotics (ICDL-EpiRob)
DOI: 10.1109/devlrn.2019.8850689
Training-ValueNet: Data Driven Label Noise Cleaning on Weakly-Supervised Web Images

Abstract: Manually labelling new datasets for image classification remains expensive and time-consuming. A promising alternative is to utilize the abundance of images on the web, for which search queries or surrounding text offer a natural source of weak supervision. Unfortunately, the label noise in these datasets has limited their use in practice. Several methods have been proposed for performing unsupervised label noise cleaning, the majority of which use outlier detection to identify and remove mislabeled images. In …
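The abstract is truncated here, but the title and the citing papers quoted below (see the statement attributing to [11] the discarding of samples that contribute negatively to training performance) indicate the core idea: score each weakly-labelled example by its contribution to training and drop examples with negative value. Below is a minimal, illustrative sketch of that training-value idea, not the authors' implementation; the linear toy model, hyper-parameters, and all function names are assumptions.

```python
# Sketch: estimate each example's "training value" as the change in
# held-out loss caused by the SGD step taken on that example, then
# discard examples whose accumulated value is negative.
import numpy as np

rng = np.random.default_rng(0)

def validation_loss(w, X_val, y_val):
    """Mean squared error of a linear model on the validation set."""
    return np.mean((X_val @ w - y_val) ** 2)

def training_values(X, y, X_val, y_val, epochs=3, lr=0.01):
    """Accumulate, per example, the drop in validation loss caused by
    the SGD step on that example (positive value = helpful example)."""
    n, d = X.shape
    w = np.zeros(d)
    value = np.zeros(n)
    for _ in range(epochs):
        for i in rng.permutation(n):
            before = validation_loss(w, X_val, y_val)
            grad = 2 * (X[i] @ w - y[i]) * X[i]   # per-example gradient
            w -= lr * grad
            value[i] += before - validation_loss(w, X_val, y_val)
    return value

# Toy data: a clean linear task with 20% of training labels corrupted.
w_true = rng.normal(size=5)
X, X_val = rng.normal(size=(200, 5)), rng.normal(size=(50, 5))
y, y_val = X @ w_true, X_val @ w_true
noisy = rng.choice(200, size=40, replace=False)
y[noisy] += rng.normal(scale=5.0, size=40)

v = training_values(X, y, X_val, y_val)
keep = v > 0   # discard examples with negative estimated training value
print(f"kept {keep.sum()}/200; "
      f"noisy kept: {np.isin(np.flatnonzero(keep), noisy).sum()}/40")
```

On this toy problem, mislabeled examples tend to increase the validation loss on the steps that consume them, so thresholding the accumulated value at zero filters most of them out.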

Cited by 5 publications (8 citation statements); references 13 publications.
“…Noisy labels pose a non-trivial problem in deep model learning, since the ability to fit noise grows with deeper layers. Given the ubiquity and importance of coping with noisy labels, many works have been devoted to combating this problem [4,5,17,18,19]. One promising direction is to learn from a small set of clean labeled data and then use it to update the network [17,4,19].…”
Section: Learning From Noisy Labels
confidence: 99%
“…One promising direction is to learn from a small set of clean labeled data and then use it to update the network [17,4,19]. Another direction is to design models that can learn directly from noisy labels [17,18,20]. In [4], an auxiliary model is trained on a small but clean dataset that was manually labeled by human experts.…”
Section: Learning From Noisy Labels
confidence: 99%
“…Additionally, labels are only provided per patient and not per voxel, which could introduce label noise, since spectra from the tumor-affected hemisphere can be falsely labeled as "tumor" even though they contain healthy brain tissue. Given the ubiquity and importance of coping with noisy labels, many works on this topic have been published [8][9][10][11]. Starting from a small set of expert-validated labels is one promising direction [8,10].…”
Section: Introduction
confidence: 99%
“…Starting from a small set of expert-validated labels is one promising direction [8,10]. Another direction is to design models that learn directly from noisy labels [10,11]. For example, [10] uses a co-teaching framework in which two DNNs are trained simultaneously on noisy labels, while [11] discards samples that contribute negatively to training performance.…”
Section: Introduction
confidence: 99%
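The co-teaching framework this citing paper attributes to [10] can be illustrated with a short sketch: two models train simultaneously, and each selects its lowest-loss examples in a batch for its peer to update on, on the premise that noisy labels tend to incur large loss early in training. The logistic models, the keep ratio, and all names below are illustrative assumptions, not the cited implementation.

```python
# Sketch: small-loss co-teaching with two logistic-regression "networks".
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def losses(w, X, y):
    """Per-example binary cross-entropy."""
    p = sigmoid(X @ w)
    return -(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))

def sgd_step(w, X, y, lr=0.1):
    p = sigmoid(X @ w)
    return w - lr * X.T @ (p - y) / len(y)

# Toy binary task with 30% of the labels flipped.
X = rng.normal(size=(400, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
flip = rng.choice(400, size=120, replace=False)
y[flip] = 1 - y[flip]

w1, w2 = np.zeros(4), np.zeros(4)
keep_ratio = 0.7   # fraction of small-loss examples kept per batch
for epoch in range(30):
    for batch in np.split(rng.permutation(400), 10):
        k = int(keep_ratio * len(batch))
        # Each model picks its small-loss subset *for the other* model.
        pick1 = batch[np.argsort(losses(w1, X[batch], y[batch]))[:k]]
        pick2 = batch[np.argsort(losses(w2, X[batch], y[batch]))[:k]]
        w1 = sgd_step(w1, X[pick2], y[pick2])
        w2 = sgd_step(w2, X[pick1], y[pick1])

acc = np.mean((sigmoid(X @ w1) > 0.5) == (X[:, 0] + X[:, 1] > 0))
print(f"accuracy vs. clean labels: {acc:.2f}")
```

Cross-feeding the selected subsets, rather than each model training on its own picks, is the key design choice: it keeps the two models from reinforcing their own selection bias.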