2021
DOI: 10.48550/arxiv.2103.00053
Preprint

PURSUhInT: In Search of Informative Hint Points Based on Layer Clustering for Knowledge Distillation

Abstract: We propose a novel knowledge distillation methodology for compressing deep neural networks. One of the most efficient methods for knowledge distillation is hint distillation, where the student model is injected with information (hints) from several different layers of the teacher model. Although the selection of hint points can drastically alter the compression performance, there is no systematic approach for selecting them, other than brute-force hyper-parameter search. We propose a clustering based hint selection…
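
The abstract describes hint distillation, where intermediate ("hint") layers of the teacher supervise the student, together with a clustering-based selection of those layers instead of brute-force search. The sketch below is only a rough illustration under assumptions of my own, not the paper's exact procedure: it clusters candidate teacher layers by representational similarity (linear CKA is my assumed metric, with average-linkage clustering) and pairs each selected hint layer with a FitNets-style regression loss. All function and variable names are hypothetical.

# Minimal sketch of clustering-based hint selection plus a hint loss.
# Assumptions (not from the paper): linear CKA similarity, average-linkage
# hierarchical clustering, FitNets-style 1x1-conv regressor; names hypothetical.
import numpy as np
import torch.nn as nn
import torch.nn.functional as F
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform


def linear_cka(X, Y):
    """Linear CKA similarity between two per-sample feature matrices.

    X: (n_samples, d1), Y: (n_samples, d2) -- flattened activations of two
    teacher layers collected on the same probe batch.
    """
    X = X - X.mean(axis=0, keepdims=True)
    Y = Y - Y.mean(axis=0, keepdims=True)
    num = np.linalg.norm(X.T @ Y, "fro") ** 2
    den = np.linalg.norm(X.T @ X, "fro") * np.linalg.norm(Y.T @ Y, "fro")
    return num / den


def select_hint_layers(layer_feats, n_hints):
    """Cluster candidate teacher layers, return one representative index per cluster.

    layer_feats: list of (n_samples, d_l) arrays, one per candidate layer.
    """
    L = len(layer_feats)
    sim = np.eye(L)
    for i in range(L):
        for j in range(i + 1, L):
            sim[i, j] = sim[j, i] = linear_cka(layer_feats[i], layer_feats[j])
    dist = np.clip(1.0 - sim, 0.0, None)   # similarity -> distance
    np.fill_diagonal(dist, 0.0)
    labels = fcluster(linkage(squareform(dist), method="average"),
                      t=n_hints, criterion="maxclust")
    hints = []
    for c in np.unique(labels):
        members = np.where(labels == c)[0]
        # keep the layer most similar, on average, to the rest of its cluster
        best = members[np.argmax(sim[np.ix_(members, members)].mean(axis=1))]
        hints.append(int(best))
    return sorted(hints)


class HintLoss(nn.Module):
    """FitNets-style hint loss: regress student features onto a teacher hint layer."""

    def __init__(self, student_channels, teacher_channels):
        super().__init__()
        # 1x1 convolution maps the student's channel count to the teacher's
        self.regressor = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)

    def forward(self, student_feat, teacher_feat):
        mapped = self.regressor(student_feat)
        # match spatial size before the MSE, in case the layers differ in resolution
        if mapped.shape[-2:] != teacher_feat.shape[-2:]:
            mapped = F.adaptive_avg_pool2d(mapped, teacher_feat.shape[-2:])
        return F.mse_loss(mapped, teacher_feat.detach())

In such a setup, a probe batch would be run through the teacher once to collect layer_feats, select_hint_layers would pick the hint points, and one HintLoss term per selected layer would be added to the usual distillation objective.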

Cited by 0 publications
References 23 publications