2021
DOI: 10.48550/arXiv.2108.11939
Preprint

Understanding and Accelerating Neural Architecture Search with Training-Free and Theory-Grounded Metrics

Abstract: This work targets designing a principled and unified training-free framework for Neural Architecture Search (NAS), with high performance, low cost, and in-depth interpretation. NAS has been studied intensively to automate the discovery of top-performing neural networks, but it suffers from heavy resource consumption and often incurs search bias due to truncated training or approximations. Recent NAS works [1], [2], [3] have started to explore indicators that can predict a network's performance without training. However, …
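The abstract is truncated before it names the paper's metrics, but the training-free idea it references is concrete enough to sketch. Below is a minimal, hypothetical Python/PyTorch sketch of one representative training-free indicator, an activation-pattern score at initialization in the spirit of NASWOT (Mellor et al., "Neural Architecture Search without Training"), which may or may not be among the works cited as [1]–[3]; it illustrates the genre of scoring untrained networks, not the specific metrics this paper proposes.

import torch
import torch.nn as nn

def naswot_score(model: nn.Module, x: torch.Tensor) -> float:
    """Score a ReLU network at initialization, with no training.

    Records the binary activation pattern of each input in a mini-batch
    and measures how distinguishable the patterns are via log|det K|,
    where K[i, j] counts the ReLU units on which inputs i and j agree.
    Higher scores correlate with better trained accuracy in Mellor et al.
    """
    codes = []

    def hook(_module, _inp, out):
        # Binary code: which ReLU units fire for each sample.
        codes.append((out > 0).flatten(1).float())

    handles = [m.register_forward_hook(hook)
               for m in model.modules() if isinstance(m, nn.ReLU)]
    with torch.no_grad():
        model(x)
    for h in handles:
        h.remove()

    c = torch.cat(codes, dim=1)            # (batch, total_relu_units)
    k = c @ c.t() + (1 - c) @ (1 - c.t())  # pairwise agreement counts
    return torch.slogdet(k)[1].item()      # log|det K|

# Usage: rank untrained candidate architectures by score.
if __name__ == "__main__":
    batch = torch.randn(32, 3, 32, 32)
    net = nn.Sequential(nn.Conv2d(3, 16, 3), nn.ReLU(),
                        nn.Flatten(), nn.Linear(16 * 30 * 30, 10))
    print(naswot_score(net, batch))

Because the score needs only a single forward pass on a mini-batch, thousands of candidate architectures can be ranked in minutes, which is the cost advantage the abstract attributes to training-free NAS.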

Cited by 0 publications
References 53 publications