2022
DOI: 10.1109/tevc.2022.3147526
BenchENAS: A Benchmarking Platform for Evolutionary Neural Architecture Search

Cited by 15 publications (3 citation statements)
References 36 publications
“…The batch size was set to 16 and the model was trained for 200 epochs. MultiStepLR is a learning rate decay method used in PyTorch [86] that adjusts the learning rate at set intervals [87]. The initial learning rate was set to 0.01.…”
Section: Training Methods (mentioning)
confidence: 99%
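The quoted setup maps onto PyTorch's built-in MultiStepLR scheduler. Below is a minimal sketch; the milestone epochs and the decay factor gamma are assumptions, since the statement only specifies a batch size of 16, 200 training epochs, and an initial learning rate of 0.01.

# Minimal sketch of the quoted training setup. The milestones and gamma
# values are assumptions; only batch size (16), epochs (200), and the
# initial learning rate (0.01) come from the cited statement.
import torch
from torch.optim.lr_scheduler import MultiStepLR

model = torch.nn.Linear(32, 10)                            # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
scheduler = MultiStepLR(optimizer, milestones=[100, 150], gamma=0.1)

for epoch in range(200):
    # ... iterate over a DataLoader with batch_size=16, compute the loss,
    # call loss.backward(), and then:
    optimizer.step()
    scheduler.step()  # decays the learning rate by gamma at each milestone epoch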
“…The evaluation of each individual in the algorithm is independent, so the population-based optimization algorithm can easily be parallelized. Based on this feature, Xie et al [56] built the BenchENAS platform. When evaluating individual fitness, individuals in the population can be evaluated in parallel in a common lab environment, which significantly speeds up population evaluation and promotes the development of ENAS.…”
Section: Introduction (mentioning)
confidence: 99%
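The parallelism described above can be illustrated with a short sketch. BenchENAS itself distributes candidate training across the GPUs of a common lab environment; the multiprocessing-based example and the toy evaluate_fitness function below are illustrative assumptions, not the BenchENAS API.

from multiprocessing import Pool

def evaluate_fitness(individual):
    # Hypothetical stand-in: in an ENAS setting this would decode the
    # encoding, train the architecture, and return its validation accuracy.
    return sum(individual)

if __name__ == "__main__":
    population = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]   # toy architecture encodings
    with Pool(processes=3) as pool:
        # Each individual's evaluation is independent, so they run concurrently.
        fitness = pool.map(evaluate_fitness, population)
    print(fitness)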
“…Moreover, the RegularizedEvo algorithm [9] ran on 450 GPUs for 7 days. In practice, buying or renting GPU resources at such a scale is unaffordable for most researchers [19]. As a result, accelerating the TEM to reduce this prohibitive computational overhead is essential, which has given rise to the research topic of EEMs in the NAS community [11].…”
Section: Introduction (mentioning)
confidence: 99%