2023
DOI: 10.1109/tpami.2022.3153065
Cyclic Differentiable Architecture Search

Cited by 28 publications (6 citation statements)
References: 72 publications
“…GDAS guided the search toward architectures with lower validation loss. Addressing the computational inefficiency of NAS algorithms, CDARTS (Cyclic Differentiable Architecture Search) [90] developed a two-step optimization. In every iteration of the algorithm, both the architecture and the model weights are trained: first the model weights are updated, and then the architecture hyperparameters are evaluated to optimize the architecture.…”
Section: Methods
confidence: 99%
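The alternating weight/architecture update described in the statement above follows the standard differentiable-NAS pattern. Below is a minimal, first-order PyTorch-style sketch of that two-step loop, not the actual CDARTS implementation; the toy supernet, the `weight_parameters()`/`arch_parameters()` helpers, and all hyperparameter values are illustrative assumptions.

```python
# Sketch of the alternating two-step optimization used in DARTS-style
# search: step 1 updates network weights on training data, step 2
# updates architecture parameters (alpha) on validation data.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySupernet(nn.Module):
    """Toy supernet: one mixed edge choosing between two candidate ops."""
    def __init__(self, dim=16):
        super().__init__()
        self.ops = nn.ModuleList([nn.Linear(dim, dim), nn.Identity()])
        # Architecture parameters (alpha): one logit per candidate op.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))
        self.head = nn.Linear(dim, 2)

    def forward(self, x):
        w = F.softmax(self.alpha, dim=0)            # continuous relaxation
        mixed = sum(wi * op(x) for wi, op in zip(w, self.ops))
        return self.head(mixed)

    def weight_parameters(self):
        return [p for n, p in self.named_parameters() if n != "alpha"]

    def arch_parameters(self):
        return [self.alpha]

def search(model, train_data, val_data, steps=100):
    w_opt = torch.optim.SGD(model.weight_parameters(), lr=0.05, momentum=0.9)
    a_opt = torch.optim.Adam(model.arch_parameters(), lr=3e-3)
    for step in range(steps):
        # Step 1: train the model weights on the training split.
        x, y = train_data[step % len(train_data)]
        w_opt.zero_grad()
        F.cross_entropy(model(x), y).backward()
        w_opt.step()
        # Step 2: optimize the architecture parameters on the validation split.
        xv, yv = val_data[step % len(val_data)]
        a_opt.zero_grad()
        F.cross_entropy(model(xv), yv).backward()
        a_opt.step()
    return model.alpha.detach()

# Toy usage with random data.
torch.manual_seed(0)
make = lambda: [(torch.randn(8, 16), torch.randint(0, 2, (8,))) for _ in range(10)]
print(search(TinySupernet(), make(), make()))
```

This sketch uses the first-order approximation (no unrolled second-order gradient) and omits CDARTS's cyclic feedback between search and evaluation networks, which is the paper's actual contribution.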
“…He et al. [54] proposed a mix-level objective function and used a model-size-based search strategy combined with an early-stopping strategy, which successfully improved the performance of the final architecture. Other researchers have also made explicit improvements to alleviate the problems present in DARTS [55][56][57][58].…”
Section: Related Work
confidence: 99%
“…However, these explicit methods, such as an early-stopping strategy, incorporating an attention mechanism, or modifying the composition of the search space, all add additional, artificially set hyper-parameters. Therefore, there are also many implicit approaches to increase the stability of the architecture search, such as architectural parameter regularization: PC-DARTS [44], FairDARTS [50], DOTS [53], CDARTS [55],…”
Section: Relationship With Previous Work
confidence: 99%
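As an illustration of the "implicit" stabilization route mentioned in this statement, architectural parameter regularization can be as simple as adding a penalty on the architecture logits during the architecture step. The fragment below extends the earlier sketch; the function name and the `reg_weight` value are arbitrary assumptions, not taken from any of the cited papers.

```python
# Architecture step with an added L2 penalty on the architecture
# parameters (alpha) -- one common "implicit" way to stabilize
# differentiable architecture search. Illustrative only; reuses
# F and the model/optimizer objects from the sketch above.
def arch_step(model, a_opt, xv, yv, reg_weight=1e-3):
    a_opt.zero_grad()
    loss = F.cross_entropy(model(xv), yv)
    loss = loss + reg_weight * sum((a ** 2).sum() for a in model.arch_parameters())
    loss.backward()
    a_opt.step()
    return loss.item()
```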
“…Chen et al. [19] proposed the idea of the progressive search strategy, which alleviates the adverse effects of the depth-gap problem to some extent. Many different search strategies [20–23] have also been proposed to alleviate the adverse effects of the depth-gap problem. In addition, the attention mechanism can help the neural network select useful features and discard the less useful ones.…”
Section: Introduction
confidence: 99%