2023
DOI: 10.1049/cvi2.12248

Improving neural ordinary differential equations via knowledge distillation

Haoyu Chu,
Shikui Wei,
Qiming Lu
et al.

Abstract: Neural ordinary differential equations (Neural ODEs) construct the continuous dynamics of hidden units using ODEs specified by a neural network, and have demonstrated promising results on many tasks. However, Neural ODEs still do not perform well on image recognition tasks. A possible reason is that the one-hot encoding vector commonly used in Neural ODEs cannot provide enough supervised information. A new training method based on knowledge distillation is proposed to construct more powerful and robust Neural ODEs …
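The abstract does not give the paper's exact loss, but knowledge distillation for classification is conventionally the Hinton-style objective: a student network is trained on a weighted sum of the usual cross-entropy with the one-hot label and a KL-divergence term that matches the teacher's temperature-softened output distribution. The sketch below illustrates that standard formulation in NumPy; the function names, the temperature `T=4.0`, and the weight `alpha=0.7` are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax. Higher T produces softer distributions."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, hard_label, T=4.0, alpha=0.7):
    """Hinton-style KD loss (illustrative, not the paper's exact objective):
    alpha * T^2 * KL(teacher || student) at temperature T
    + (1 - alpha) * cross-entropy with the one-hot hard label.
    The T^2 factor keeps the soft-target gradient magnitude comparable
    across temperatures."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL divergence between the softened teacher and student distributions
    kd = np.sum(p_teacher * (np.log(p_teacher + 1e-12)
                             - np.log(p_student + 1e-12)), axis=-1)
    # Ordinary cross-entropy on the hard (one-hot) label at T = 1
    p_hard = softmax(student_logits, 1.0)
    ce = -np.log(p_hard[..., hard_label] + 1e-12)
    return alpha * (T ** 2) * kd + (1 - alpha) * ce

# Toy example: a 3-class logit vector from a hypothetical teacher and student.
teacher = np.array([4.0, 1.0, 0.5])
student = np.array([3.0, 1.5, 0.2])
loss = distillation_loss(student, teacher, hard_label=0)
```

In a Neural-ODE setting the student logits would come from the ODE-defined classifier, so the soft teacher targets supply richer supervision than the one-hot vector alone, which is the deficiency the abstract identifies.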


Cited by 0 publications
References 49 publications (69 reference statements)