Joint Architecture and Knowledge Distillation in CNN for Chinese Text Recognition

Preprint, 2019
DOI: 10.48550/arxiv.1912.07806

Abstract: Distillation helps transform a cumbersome neural network into a compact one so that the model can be deployed on alternative hardware devices. The main advantages of distillation-based approaches are a simple training process, support in most off-the-shelf deep learning software, and no special hardware requirements. In this paper, we propose a guideline for distilling the architecture and knowledge of pre-trained standard CNNs simultaneously. We first make a quantitative analysis of the ba…
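
The abstract only sketches the approach; the paper's joint architecture-and-knowledge procedure is not reproduced here. As a general illustration of the distillation technique it builds on, below is a minimal sketch of the classic soft-target loss (Hinton et al., 2015) in PyTorch; the temperature T and mixing weight alpha are illustrative assumptions, not values from the paper.

    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
        # Soft-target term: KL divergence between the temperature-softened
        # teacher and student distributions, scaled by T^2 so gradient
        # magnitudes stay comparable across temperatures.
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=1),
            F.softmax(teacher_logits / T, dim=1),
            reduction="batchmean",
        ) * (T * T)
        # Hard-label term: ordinary cross-entropy on the ground-truth labels.
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1.0 - alpha) * hard

In a setup like the one the abstract describes, a compact student CNN would be trained on Chinese text-recognition data against logits produced by the frozen pre-trained teacher network, using a loss of this form.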

Cited by 0 publications
References: 60 publications (78 reference statements)