2019
DOI: 10.48550/arxiv.1912.07651
Preprint
UNAS: Differentiable Architecture Search Meets Reinforcement Learning

Abstract: Neural architecture search (NAS) aims to discover network architectures with desired properties such as high accuracy or low latency. Recently, differentiable NAS (DNAS) has demonstrated promising results while maintaining a search cost orders of magnitude lower than reinforcement learning (RL) based NAS. However, DNAS models can only optimize differentiable loss functions in search, and they require an accurate differentiable approximation of non-differentiable criteria. In this work, we present UNAS, a unifi…
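As background to the abstract's contrast between DNAS and RL-based search: differentiable NAS methods typically relax the discrete choice among candidate operations on an edge into a soft mixture, commonly via a Gumbel-softmax (as in SNAS/FBNet-style approaches), so that architecture parameters can be trained by gradient descent. The sketch below is a minimal, illustrative version of that relaxation — not the UNAS method itself, and the function names are our own.

```python
import math
import random

def gumbel_softmax(logits, tau=1.0):
    """Relax a categorical choice over candidate ops into soft,
    differentiable mixing weights (illustrative; logits are the
    learnable architecture parameters)."""
    # Sample standard Gumbel noise for each candidate operation.
    gumbels = [-math.log(-math.log(random.random())) for _ in logits]
    scores = [(l + g) / tau for l, g in zip(logits, gumbels)]
    # Numerically stable softmax over the perturbed scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def mixed_op(x, ops, logits, tau=1.0):
    """A 'mixed edge': the weighted sum of all candidate ops applied
    to the same input, with Gumbel-softmax mixing weights."""
    weights = gumbel_softmax(logits, tau)
    return sum(w * op(x) for w, op in zip(weights, ops))
```

Because the mixture is differentiable in the logits, the search can follow gradients of any differentiable loss — which is exactly why, as the abstract notes, non-differentiable criteria (e.g. measured latency) need either an approximation or an RL-style estimator.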

Cited by 1 publication (1 citation statement) · References 20 publications
“…For example, while EfficientNets vastly outperform ResNets in terms of theoretical training efficiency, they have often been found to underperform when considering practical training efficiency on GPUs (Lee et al, 2020). Some recent work has used NAS to optimise practical efficiency on GPUs (Cai et al, 2018; Vahdat et al, 2019; Lin et al, 2020). For the presented work, we prioritised hand-engineered solutions, but do not rule out NAS methods in future work.…”
Section: Efficient CNNs
confidence: 99%