2024
DOI: 10.1109/tpami.2023.3328347

Understanding and Accelerating Neural Architecture Search With Training-Free and Theory-Grounded Metrics

Wuyang Chen, Xinyu Gong, Junru Wu, et al.

Abstract: This work targets designing a principled and unified training-free framework for Neural Architecture Search (NAS), with high performance, low cost, and in-depth interpretation. NAS has been studied intensively to automate the discovery of top-performing neural networks, but it suffers from heavy resource consumption and often incurs search bias due to truncated training or approximations. Recent NAS works [1], [2], [3] have started to explore indicators that can predict a network's performance without training. However, …
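The truncated abstract refers to training-free indicators that score candidate architectures at initialization, before any gradient descent. As a rough illustration only, and not the paper's actual metrics, the sketch below ranks a few hypothetical candidate networks with a simple gradient-magnitude proxy computed from a single random batch; the candidate pool and the proxy itself are assumptions made for this example.

```python
# Hypothetical sketch of training-free architecture scoring.
# The proxy (sum of parameter-gradient magnitudes at initialization)
# is an illustrative stand-in, not the metrics proposed in the paper.
import torch
import torch.nn as nn


def proxy_score(model: nn.Module, input_shape=(8, 3, 32, 32)) -> float:
    """Score a randomly initialized model without any training steps."""
    model.zero_grad()
    x = torch.randn(*input_shape)
    model(x).sum().backward()  # scalar objective just to obtain gradients
    return sum(p.grad.abs().sum().item()
               for p in model.parameters() if p.grad is not None)


# Hypothetical candidate pool: three small CNNs of different widths.
candidates = {
    f"width_{w}": nn.Sequential(
        nn.Conv2d(3, w, 3, padding=1), nn.ReLU(),
        nn.Conv2d(w, w, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(w, 10),
    )
    for w in (8, 16, 32)
}

# Rank candidates by the training-free proxy; no weights are ever trained.
ranking = sorted(candidates, key=lambda name: proxy_score(candidates[name]),
                 reverse=True)
print(ranking)
```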

Cited by 3 publications
References 74 publications