2020
DOI: 10.3389/fnins.2020.00667

Bayesian Multi-objective Hyperparameter Optimization for Accurate, Fast, and Efficient Neural Network Accelerator Design

Abstract: In resource-constrained environments, such as low-power edge devices and smart sensors, deploying a fast, compact, and accurate intelligent system with minimum energy is indispensable. Embedding intelligence can be achieved using neural networks on neuromorphic hardware. Designing such networks would require determining several inherent hyperparameters. A key challenge is to find the optimum set of hyperparameters that might belong to the input/output encoding modules, the neural network itself, the applicatio…

Cited by 41 publications (10 citation statements)
References 47 publications
“…This opens up new horizons to not only focus on digital computing, but also to rethink using analogue, approximate and mixed-signal computing 118 , as biological neural computation itself is inherently analogue and stochastic. Among several approaches proposed in the literature on software-hardware co-design, one is using Bayesian optimization and Neural Architecture Search approaches in which several stacks of computing that range from materials and devices to algorithm and applications are codesigned to optimize overall system performance [119][120][121] . For example, in a memristive crossbar-based accelerator, an automatic codesign optimization approach has the opportunity to define the number and sizes of crossbars to optimize the accuracy and energy efficiency of the design for different applications or datasets.…”
Section: Discussion
confidence: 99%
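The memristive-crossbar example in the quoted passage can be made concrete with a tiny scalarized sweep over the number and size of crossbars. This is a minimal sketch only: the accuracy and energy models below are entirely hypothetical stand-ins for a real hardware simulator, and the weight, candidate ranges, and normalization constant are illustrative, not from the cited work.

```python
import itertools

# Entirely hypothetical cost model standing in for real hardware
# simulation: more/larger crossbars raise accuracy with diminishing
# returns, but cost more energy (plus a per-crossbar overhead).
def accuracy(n_xbars, size):
    capacity = n_xbars * size * size
    return 1.0 - 1.0 / (1.0 + capacity / 4096.0)

def energy(n_xbars, size):
    return n_xbars * size * size * 1e-3 + 0.5 * n_xbars  # arbitrary units

def best_tradeoff(weight=0.5):
    """Scalarized co-design sweep: trade accuracy against (normalized) energy."""
    configs = itertools.product([1, 2, 4, 8], [32, 64, 128])
    def score(cfg):
        n, s = cfg
        return weight * accuracy(n, s) - (1 - weight) * energy(n, s) / 200.0
    return max(configs, key=score)

cfg = best_tradeoff(weight=0.5)  # (num_crossbars, crossbar_size)
```

A scalarized objective collapses the two goals into a single score; a Pareto-based search would instead keep every non-dominated configuration for the designer to choose from.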
“…In contrast to the trainable parameters in the CNN that receive feedback from the loss through gradients (Methods), seeking the best combination of HPs is a black-box problem that can only be solved by trial (ad hoc approach). Random search 35 and Bayesian search 34,36 based on the Gaussian process are methods that could help to refine the HPs.…”
Section: Results
confidence: 99%
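The quoted passage contrasts random search with Bayesian search; the former fits in a few lines. A minimal random-search sketch, where `toy_loss` is a hypothetical stand-in for an expensive validation-loss evaluation and the search space is illustrative:

```python
import random

def random_search(objective, space, n_trials=50, seed=0):
    """Draw hyperparameter configurations uniformly at random and
    keep the one with the lowest objective value."""
    rng = random.Random(seed)
    best_cfg, best_val = None, float("inf")
    for _ in range(n_trials):
        cfg = {name: rng.choice(values) for name, values in space.items()}
        val = objective(cfg)
        if val < best_val:
            best_cfg, best_val = cfg, val
    return best_cfg, best_val

# Hypothetical black-box objective standing in for a validation loss.
def toy_loss(cfg):
    return (cfg["lr"] - 0.01) ** 2 + 0.1 * abs(cfg["layers"] - 3)

space = {"lr": [0.001, 0.005, 0.01, 0.05, 0.1], "layers": [1, 2, 3, 4, 5]}
best_cfg, best_val = random_search(toy_loss, space, n_trials=100)
```

Bayesian search differs in the sampling step: instead of drawing configurations uniformly, it fits a surrogate (typically a Gaussian process) to past evaluations and proposes the next trial by maximizing an acquisition function.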
“…The classical HPO problem is defined as λ* ∈ arg min_{λ ∈ Λ} GE(I, J, ρ, λ), i.e., the goal is to minimize the estimated generalization error when I (learner), J (resampling splits), and ρ (performance measure) are fixed; see [1] for further details. Instead of optimizing only for predictive performance, other metrics such as model sparsity or computational efficiency of prediction (e.g., MACs and FLOPs, or model size and memory usage) could be included, resulting in a multi-objective HPO problem [37][38][39][40][41]. c(λ) is a black-box function, as it usually has no closed-form mathematical representation, and analytic gradient information is generally not available.…”
Section: Hyperparameter Optimization
confidence: 99%
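A multi-objective HPO problem such as the one described above returns a set of non-dominated trade-offs rather than a single optimum. A minimal Pareto-filter sketch, with hypothetical (generalization error, model size in MB) objective pairs for illustration, all objectives minimized:

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical (generalization error, model size in MB) pairs per config.
candidates = [(0.10, 50.0), (0.12, 20.0), (0.08, 80.0), (0.12, 30.0), (0.15, 10.0)]
front = pareto_front(candidates)
# (0.12, 30.0) is dominated by (0.12, 20.0): equal error, smaller model.
```

Each surviving point is a defensible design choice; picking among them (e.g., favoring accuracy on a server, size on an edge device) is left to the deployment constraints.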