2018
DOI: 10.48550/arxiv.1808.01097
Preprint

CurriculumNet: Weakly Supervised Learning from Large-Scale Web Images

Cited by 8 publications (7 citation statements) | References 15 publications

“…The idea of curriculum learning was originally proposed in [1], which demonstrated that learning from easy to hard significantly improves the generalization of deep models. To date, work on curriculum learning has mainly focused on visual category discovery [29,41], object tracking [47], and semi-/weakly-supervised learning [11,12,23,40]. [40] proposed an approach that processes multiple tasks in a sequence, with sharing between subsequent tasks, instead of solving all tasks jointly, by finding the best order in which to learn the tasks.…”
Section: Related Work
confidence: 99%
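To make the easy-to-hard strategy concrete, here is a minimal sketch that orders samples into curriculum stages by a precomputed difficulty score (for example, the loss of a weakly trained model). The function name, stage count, and scoring source are illustrative assumptions, not the exact design of the cited works.

```python
# Minimal sketch of easy-to-hard curriculum ordering (illustrative only).
# Assumes "difficulty" is a precomputed per-sample score: lower = easier.
import numpy as np

def curriculum_stages(difficulty, n_stages=3):
    """Split sample indices into stages, easiest samples first."""
    order = np.argsort(difficulty)           # indices sorted easy -> hard
    return np.array_split(order, n_stages)   # one index array per stage

# Toy usage: 10 samples with random difficulty scores.
rng = np.random.default_rng(0)
difficulty = rng.random(10)
for stage, idx in enumerate(curriculum_stages(difficulty)):
    # In training, stage s would typically use the union of stages 0..s.
    print(f"stage {stage}: samples {idx.tolist()}")
```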
“…Very few works address imbalanced learning. Guo et al. [12] developed a principled learning strategy by leveraging curriculum learning in a weakly supervised framework, with the goal of learning effectively from imbalanced data.…”
Section: Related Work
confidence: 99%
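For context on what learning from imbalanced data involves, the sketch below shows one standard countermeasure, weighting each sample inversely to its class frequency; this is a generic illustration, not the specific curriculum-based strategy of Guo et al. [12].

```python
# Generic inverse-class-frequency sample weighting (illustrative only,
# not the curriculum-based strategy of Guo et al. [12]).
from collections import Counter

def inverse_frequency_weights(labels):
    """Return one weight per sample; rare classes get larger weights."""
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    return [n / (k * counts[y]) for y in labels]

# Toy usage: class 0 appears 3x as often as class 1.
print(inverse_frequency_weights([0, 0, 0, 1]))  # ~[0.67, 0.67, 0.67, 2.0]
```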
“…With synthetic noisy labeled data, Rolnick et al. [28] demonstrate that deep learning is robust to noise when the training set is sufficiently large and an appropriately large batch size and learning rate are used. Guo et al. [6] develop a curriculum training scheme that learns noisy data from easy to hard. Jiang et al. [10] design a MentorNet to adjust the loss weights of noisy samples during training.…”
Section: Noisy Data Learning Methods
confidence: 99%
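As a much-simplified illustration of per-sample loss weighting for noisy labels, the sketch below zeroes out the loss of samples whose loss exceeds a fixed threshold. The threshold rule is an assumption for illustration; MentorNet [10] learns its weighting rather than thresholding.

```python
# Simplified per-sample loss weighting for noisy labels (illustrative;
# MentorNet learns these weights, whereas a fixed threshold is used here).
import torch
import torch.nn.functional as F

def weighted_ce_loss(logits, targets, loss_threshold=2.0):
    per_sample = F.cross_entropy(logits, targets, reduction="none")
    # Treat samples with unusually high loss as likely noisy: weight 0.
    weights = (per_sample <= loss_threshold).float()
    return (weights * per_sample).sum() / weights.sum().clamp(min=1.0)

# Toy usage: a batch of 4 samples over 3 classes.
logits = torch.randn(4, 3)
targets = torch.tensor([0, 1, 2, 0])
print(weighted_ce_loss(logits, targets).item())
```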
“…With the same full dataset, our guidance learning framework further improves the teacher model by 2.08%. As another useful trick in noisy-data learning [25,6], fine-tuning the model trained on noisy data with the clean set further boosts the final performance. This trick improves our guidance learning from 68.86% to 71.4%, and boosts model #3 from 66.78% to 68.6%.…”
Section: Exploration Of Guidance Learning On
confidence: 99%
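A minimal sketch of this two-phase trick, assuming a PyTorch setup: train on the large noisy set first, then fine-tune on the clean set at a lower learning rate. The toy model, data, learning rates, and epoch counts are all illustrative stand-ins.

```python
# Two-phase training: noisy pretraining, then clean fine-tuning
# (illustrative sketch; model, data, and hyperparameters are toy values).
import torch
from torch.utils.data import DataLoader, TensorDataset

def train(model, loader, lr, epochs):
    """Run one training phase with its own learning rate."""
    opt = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    loss_fn = torch.nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

model = torch.nn.Linear(8, 3)  # stand-in for the real network
noisy = DataLoader(TensorDataset(torch.randn(64, 8), torch.randint(0, 3, (64,))), batch_size=16)
clean = DataLoader(TensorDataset(torch.randn(16, 8), torch.randint(0, 3, (16,))), batch_size=16)

train(model, noisy, lr=0.1, epochs=2)   # phase 1: large noisy set
train(model, clean, lr=0.01, epochs=1)  # phase 2: clean set, lower LR
```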