2022 26th International Conference on Pattern Recognition (ICPR)
DOI: 10.1109/icpr56361.2022.9956061
CES-KD: Curriculum-based Expert Selection for Guided Knowledge Distillation

Cited by 2 publications (1 citation statement)
References 23 publications
“…In this section, we first compared CSKD with some previous methods, including KD,[12] FitNet,[34] AT,[35] SP,[36] VID,[38] HKD,[39] MGD,[41] CRD,[40] virtual knowledge distillation (VKD),[42] curriculum expert selection for knowledge distillation (CESKD),[43] and multilevel attention-based sample correlations for knowledge distillation (MASCKD),[44] on benchmark datasets to verify its effectiveness. Then, we conducted representational transferability experiments to evaluate the quality of representations learned by the student network.…”

Section: Methods
Citation type: mentioning
Confidence: 99%