2021
DOI: 10.48550/arxiv.2104.14132
Preprint

Generalization Guarantees for Neural Architecture Search with Train-Validation Split

Abstract: Neural Architecture Search (NAS) is a popular method for automatically designing optimized architectures for high-performance deep learning. In this approach, it is common to use bilevel optimization where one optimizes the model weights over the training data (lower-level problem) and various hyperparameters such as the configuration of the architecture over the validation data (upper-level problem). This paper explores the statistical aspects of such problems with train-validation splits. In practice, the lo…
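
The bilevel setup described in the abstract can be made concrete with a small sketch. Below is a hedged toy example, not the paper's method, of first-order alternating updates in the style of DARTS, written in JAX: model weights w are trained on the training split (lower level) while architecture parameters alpha are updated on the validation split (upper level). The two candidate operations, the synthetic data, and the step sizes eta_w and eta_a are all illustrative assumptions.

```python
# Minimal sketch of bilevel NAS with a train-validation split.
# Lower level: model weights w on training data.
# Upper level: architecture parameters alpha on validation data.
import jax
import jax.numpy as jnp

def predict(w, alpha, x):
    # Two candidate "operations" (identity and square) mixed by a
    # softmax over alpha; w holds one model weight per operation.
    mix = jax.nn.softmax(alpha)
    ops = jnp.stack([x, x ** 2], axis=-1)   # shape (n, 2)
    return ops @ (mix * w)                  # softmax-weighted sum of ops

def loss(w, alpha, x, y):
    return jnp.mean((predict(w, alpha, x) - y) ** 2)

# Synthetic data whose true relationship uses the x**2 operation.
x = jax.random.normal(jax.random.PRNGKey(0), (200,))
y = 2.0 * x ** 2 + 0.1 * jax.random.normal(jax.random.PRNGKey(1), (200,))
x_tr, y_tr = x[:100], y[:100]     # training split (lower level)
x_val, y_val = x[100:], y[100:]   # validation split (upper level)

w = jnp.ones(2)       # lower-level variables (model weights)
alpha = jnp.zeros(2)  # upper-level variables (architecture params)
eta_w, eta_a = 0.05, 0.05

grad_w = jax.grad(loss, argnums=0)
grad_a = jax.grad(loss, argnums=1)

for step in range(500):
    # Lower level: one gradient step on the training loss w.r.t. w.
    w = w - eta_w * grad_w(w, alpha, x_tr, y_tr)
    # Upper level: one first-order step on the validation loss w.r.t. alpha.
    alpha = alpha - eta_a * grad_a(w, alpha, x_val, y_val)

# The mixing weights typically drift toward the x**2 operation here.
print("mixing weights:", jax.nn.softmax(alpha))
```

Practical NAS systems differ mainly in how the upper-level gradient is computed (exact implicit differentiation versus the first-order approximation used above); the train-validation split itself is the object the paper's generalization analysis studies.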

Cited by 0 publications
References 43 publications