2021 IEEE/CVF International Conference on Computer Vision (ICCV)
DOI: 10.1109/iccv48922.2021.00502

Adaptive Curriculum Learning

Cited by 15 publications (6 citation statements) | References 24 publications
“…This is further improved in [56] by encouraging more exploration during early phases of learning. More recently, [18] propose a curriculum which computes exponential moving averages of loss values as difficulty scores for training samples.…”
Section: Related Work
confidence: 99%
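The statement above describes scoring sample difficulty with an exponential moving average (EMA) of per-sample training losses. A minimal sketch of that idea follows; the function and parameter names (`update_difficulty`, `decay`) are illustrative assumptions, not names from the cited paper.

```python
# Hedged sketch: track per-sample difficulty as an EMA of training losses.
# Samples whose loss stays high keep a high score and rank as "harder".

def update_difficulty(scores, sample_ids, losses, decay=0.9):
    """Update EMA difficulty scores in-place for one batch of samples."""
    for sid, loss in zip(sample_ids, losses):
        prev = scores.get(sid, loss)            # first sighting: seed with the raw loss
        scores[sid] = decay * prev + (1 - decay) * loss
    return scores

scores = {}
update_difficulty(scores, [0, 1], [2.0, 0.5])   # epoch 1 losses
update_difficulty(scores, [0, 1], [1.0, 0.4])   # epoch 2 losses
# sample 0 retains the higher EMA loss, so it is ordered as the harder sample
hard_first = sorted(scores, key=scores.get, reverse=True)
```

A curriculum can then schedule `hard_first` in reverse (easy samples first) and gradually admit harder ones as training progresses.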
“…We use scores based on pretrained models (Zhang and Bansal, 2019; to compute difficulty. Kong et al. (2021) and Cai et al. (2020) demonstrate that an adaptive curriculum can improve convergence for image classification and neural response generation, respectively. We examine adaptive CL for commonsense reasoning.…”
Section: Related Work
confidence: 99%
“…When the optimal curriculum for a student is not known in advance, a teacher usually draws up a curriculum based on past teaching experience and then adjusts it according to the student's learning progress. Accordingly, Kong et al. (2021) propose initializing the curriculum with the difficulty scores obtained from the teacher model and then adapting the scores to the current state of the learner model (see Fig. 1).…”
Section: Adaptive Curriculum Learning
confidence: 99%
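The teacher-then-adapt scheme described above can be sketched as a simple blend of a fixed teacher-derived score with the learner's current loss. This is a hedged illustration of the general idea only; the mixing weight `alpha` and the name `adapt_score` are assumptions, not the paper's actual update rule.

```python
# Hedged sketch: initialize difficulty from a pretrained teacher's scores,
# then pull each score toward the learner's current loss so the curriculum
# tracks the learner's state rather than staying fixed.

def adapt_score(teacher_score, learner_loss, alpha=0.5):
    """Blend the teacher-derived difficulty with the learner's loss."""
    return (1 - alpha) * teacher_score + alpha * learner_loss

teacher_scores = [0.2, 0.8, 0.5]   # fixed, computed once from the teacher
learner_losses = [0.9, 0.1, 0.5]   # refreshed as the learner trains
adapted = [adapt_score(t, l) for t, l in zip(teacher_scores, learner_losses)]
```

With `alpha = 0` the curriculum stays entirely teacher-defined; with `alpha = 1` it degenerates to pure self-paced learning driven by the learner's own losses.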
“…The latest literature (Delange et al. 2021) proposed a series of guidelines for continual learning methods to be applicable in practice: i) good performance and little forgetting on previous tasks; ii) no oracle of task identifiers at inference time; and iii) a bounded memory footprint throughout the entire training phase. Unfortunately, most of the existing methods fail to satisfy all of these guidelines (Chaudhry et al. 2021; Wang et al. 2020; Lopez-Paz et al. 2017; Wei et al. 2021; Riemer et al. 2019; Kong et al. 2021; Schwarz et al. 2018; Zhu et al. 2020).…”
Section: Introduction
confidence: 99%