2022
DOI: 10.1145/3524500

Neural Architecture Search Survey: A Hardware Perspective

Abstract: We review the problem of automating the hardware-aware architectural design process of Deep Neural Networks (DNNs). The field of Convolutional Neural Network (CNN) algorithm design has led to advancements in many fields such as computer vision, virtual reality, and autonomous driving. The end-to-end design process of a CNN is a challenging and time-consuming task, as it requires expertise in multiple areas such as signal and image processing, neural networks, and optimization. At the same time, several hardware pla…

Cited by 39 publications (11 citation statements) | References 75 publications
“…It consists of exploring the search space of possible neural network architectures to optimize them according to one or several metrics, such as model accuracy, size, or computational complexity. Within NAS, several methods for traversing the vast search space of possible architectures have been proposed, including gradient-based methods [27], Reinforcement Learning [5], and Genetic Algorithms (GA). In particular, the latter has been adopted in computer vision as a way to find an optimal CNN structure for face recognition [24] and, more recently, has achieved better performance than manually-derived CNNs for image classification [23].…”
Section: Introduction
confidence: 99%
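
The genetic-algorithm strategy mentioned in the statement above can be sketched compactly: keep a population of candidate architectures, score them, retain the fittest, and mutate survivors. The Python sketch below is a minimal, hypothetical illustration; the search space (lists of layer widths) and the `fitness` stand-in are assumptions, and a real NAS run would replace `fitness` with training and validating each candidate network.

```python
import random

# Hypothetical toy search space: an architecture is a list of layer widths.
SEARCH_SPACE = [16, 32, 64, 128]
DEPTH = 4

def random_arch():
    return [random.choice(SEARCH_SPACE) for _ in range(DEPTH)]

def mutate(arch):
    # Copy the parent and resample one randomly chosen layer width.
    child = list(arch)
    child[random.randrange(DEPTH)] = random.choice(SEARCH_SPACE)
    return child

def fitness(arch):
    # Placeholder objective (illustrative only): a real NAS run would
    # train the candidate and return its validation accuracy here.
    return -abs(sum(arch) - 200)

def evolve(generations=20, population_size=10):
    population = [random_arch() for _ in range(population_size)]
    for _ in range(generations):
        # Selection: keep the top half of the population by fitness.
        population.sort(key=fitness, reverse=True)
        survivors = population[: population_size // 2]
        # Variation: refill the population with mutated survivors.
        population = survivors + [
            mutate(random.choice(survivors))
            for _ in range(population_size - len(survivors))
        ]
    return max(population, key=fitness)

print(evolve())
```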
“…Recently, pruning-at-initialization methods [4,5] have attracted considerable interest as they can reduce the training cost while achieving similar accuracy. Neural Architecture Search (NAS) has become a popular approach for automatically discovering competitive neural network architectures, surpassing those designed by humans [2]. Existing methods [1,12] primarily focus on accuracy while neglecting other hardware-related factors, such as latency, energy, and memory.…”
Section: Introduction
confidence: 99%
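
One common way to fold the hardware-related factors named above (latency, energy, memory) into a NAS objective is weighted scalarization with budget penalties. The snippet below is a hedged sketch of that idea, not any particular paper's method; the metric values, weights, and budget targets are all hypothetical.

```python
# Minimal sketch of a hardware-aware NAS objective via weighted
# scalarization. All weights and budgets here are assumed values.
def hardware_aware_score(accuracy, latency_ms, energy_mj, memory_mb,
                         target_latency_ms=10.0, target_memory_mb=8.0):
    score = accuracy
    # Penalize candidates that exceed the deployment budgets.
    if latency_ms > target_latency_ms:
        score -= 0.05 * (latency_ms - target_latency_ms)
    if memory_mb > target_memory_mb:
        score -= 0.05 * (memory_mb - target_memory_mb)
    # Mild constant pressure toward energy efficiency.
    score -= 0.001 * energy_mj
    return score

# Example: a slightly less accurate but much faster, smaller candidate
# can outrank a more accurate one that blows the latency/memory budget.
print(hardware_aware_score(accuracy=0.92, latency_ms=8.0,
                           energy_mj=5.0, memory_mb=6.0))
print(hardware_aware_score(accuracy=0.94, latency_ms=25.0,
                           energy_mj=9.0, memory_mb=12.0))
```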
“…With the rapid advancement of artificial intelligence (AI) technologies, the demand for constructing AI models has surged [1][2][3][4][5]. Concurrently, the endeavor of devising neural architectures within machine learning has emerged as a time-intensive and laborious task, mainly due to the inherent challenge of attaining optimal neural architectures solely through expert insights [1,5,6]. More precisely, the pursuit of optimal or sufficiently effective architectures frequently entails considerable time and effort.…”
Section: Introduction
confidence: 99%
“…More precisely, the pursuit of optimal or sufficiently effective architectures frequently entails considerable time and effort. The complexity stems from the inherent diversity of plausible neural architectures for a given task, necessitating recourse to a brute-force methodology involving training for each architecture to pinpoint the optimal configuration [1][2][3]5,6]. This backdrop has spurred a surge in AutoML research, particularly within the domain of Neural Architecture Search (NAS), which strives to streamline the exploration of efficient architectures [4,6].…”
Section: Introduction
confidence: 99%
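
The combinatorial blow-up behind that brute-force observation is easy to make concrete. The toy enumeration below assumes a hypothetical four-operation choice set per layer; the operation names are illustrative only.

```python
from itertools import product

# Hypothetical per-layer operation choices; names are illustrative only.
ops = ["conv3x3", "conv5x5", "depthwise", "skip"]
depth = 4

# Brute-force NAS enumerates every assignment of one op per layer...
candidates = list(product(ops, repeat=depth))
print(f"{len(candidates)} candidate architectures at depth {depth}")  # 256

# ...and would train and validate each one. The count grows as
# len(ops) ** depth: at depth 16 it is 4**16 = 4,294,967,296, which is
# why guided search (gradients, RL, evolution) replaces exhaustion.
```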