2022
DOI: 10.1109/tpds.2022.3140681

Building High-throughput Neural Architecture Search Workflows via a Decoupled Fitness Prediction Engine

Abstract: Neural networks (NN) are used in high-performance computing and high-throughput analysis to extract knowledge from datasets. Neural architecture search (NAS) automates NN design by generating, training, and analyzing thousands of NNs. However, NAS requires massive computational power for NN training. To address challenges of efficiency and scalability, we propose PENGUIN, a decoupled fitness prediction engine that informs the search without interfering in it. PENGUIN uses parametric modeling to predict fitness…
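The abstract describes PENGUIN as predicting NN fitness through parametric modeling of the training curve so that unpromising candidates can be cut short. The sketch below is only a rough illustration of that general idea, not PENGUIN's actual model or interface: the power-law curve family, the epoch budget, and the termination cutoff are all assumptions made for the example.

    # Minimal sketch of parametric learning-curve extrapolation for early
    # termination in NAS. The pow3 model, epoch budget, and cutoff are
    # illustrative assumptions, not the formulation used by PENGUIN.
    import numpy as np
    from scipy.optimize import curve_fit

    def pow3(epoch, c, a, b):
        """Power-law learning curve: accuracy approaches c as epochs grow."""
        return c - a * np.power(epoch, -b)

    def predict_final_fitness(val_acc, total_epochs):
        """Fit the partial validation-accuracy curve, extrapolate to total_epochs."""
        epochs = np.arange(1, len(val_acc) + 1, dtype=float)
        popt, _ = curve_fit(pow3, epochs, val_acc,
                            p0=[0.9, 0.5, 0.5], maxfev=10_000)
        return float(pow3(total_epochs, *popt))

    # Hypothetical use inside a NAS training loop: after a few epochs,
    # predict final fitness and stop training if it falls below a cutoff.
    observed = [0.42, 0.55, 0.61, 0.65, 0.68]   # validation accuracy, epochs 1..5
    predicted = predict_final_fitness(observed, total_epochs=50)
    if predicted < 0.70:                         # illustrative cutoff
        print(f"predicted final accuracy {predicted:.3f}: terminate early")
    else:
        print(f"predicted final accuracy {predicted:.3f}: continue training")

Because the predictor only reads the observed learning curve and returns an estimate, it can sit alongside any search strategy without altering how candidate architectures are generated, which is the "decoupled" property the abstract emphasizes.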

Cited by 4 publications (4 citation statements)
References 32 publications

Citation statements:
“…IEEE Transactions on Parallel and Distributed Systems, 2022, In Press. DOI 10.1109/TPDS.2022.3140681 [12] …”
Section: Specifications Table
Mentioning, confidence: 99%
“…Such tools include neural architecture search [6], [7], [8] and methods for NN fitness prediction and training termination [9], [10], [11]. Learning curve data is essential to the development of methods for NN fitness modeling and prediction [3], [12]. Researchers can use our dataset to study the evolution of NN fitness during training and identify relationships between an NN’s structure and its fitness on a given image dataset.…”
Section: Specifications Table
Mentioning, confidence: 99%