HyperNOMAD

2021 | DOI: 10.1145/3450975

Abstract: The performance of deep neural networks is highly sensitive to the choice of the hyperparameters that define the structure of the network and the learning process. When facing a new application, tuning a deep neural network is a tedious and time-consuming process that is often described as a “dark art.” This explains the necessity of automating the calibration of these hyperparameters. Derivative-free optimization is a field that develops methods designed to optimize time-consuming functions without relying on…
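The abstract is truncated above. For orientation, the tuning task it describes is conventionally cast as a blackbox optimization problem; the formulation below is a hedged sketch with illustrative notation, not quoted from the paper:

    \min_{x \in \Omega} f(x)

where x collects the architectural and training hyperparameters, \Omega is their mixed domain (continuous, integer, and categorical variables), and f(x) is the validation error of the network trained with configuration x. Each evaluation of f requires a full training run and exposes no derivatives, which is what makes derivative-free methods applicable.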

Cited by 20 publications (7 citation statements) | References 38 publications

Citation statements:
“…This section describes ∆-MADS, a hybrid algorithm that mixes the local search of HyperNOMAD [24,25] with the global exploration scheme of ∆-DOGS [15]. ∆-MADS is designed to solve derivative-free optimization problems formulated as follows:…”
Section: The ∆-MADS Method
confidence: 99%
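The quoted formulation is truncated. As a toy illustration of the hybrid structure described above (and only that), the sketch below alternates a global exploration step, standing in for ∆-DOGS's surrogate-driven search, with a local coordinate poll around the incumbent, standing in for HyperNOMAD's MADS-based local search. The objective, bounds, and schedule are assumptions made for the example, not the ∆-MADS algorithm itself.

    import numpy as np

    def f(x):
        # Hypothetical cheap stand-in for an expensive blackbox objective.
        return float(np.sum((x - 0.7) ** 2))

    rng = np.random.default_rng(1)
    dim, lo, hi = 2, 0.0, 1.0
    best_x = rng.uniform(lo, hi, dim)
    best_f = f(best_x)
    step = 0.25  # local poll radius, analogous to a mesh size

    for it in range(60):
        if it % 2 == 0:
            # Global exploration step (plain random sampling here).
            cand = [rng.uniform(lo, hi, dim)]
        else:
            # Local poll along +/- coordinate directions.
            dirs = np.vstack([np.eye(dim), -np.eye(dim)])
            cand = [np.clip(best_x + step * d, lo, hi) for d in dirs]
        vals = [f(c) for c in cand]
        i = int(np.argmin(vals))
        if vals[i] < best_f:
            best_x, best_f = cand[i], vals[i]
        elif it % 2 == 1:
            step *= 0.5  # refine the poll radius after a failed poll

    print(best_x, best_f)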
“…Bayesian methods offer a more sophisticated alternative by using the previously evaluated points to construct either a model of the objective function f, in the case of Gaussian processes [30] or random forests [16], or a model of the distribution of good and bad configurations, in the case of tree Parzen estimators [13]. Other approaches have also been tested, such as reinforcement learning [10,32], which is successfully used to find appropriate architectures for convolutional neural networks; more recently, the HyperNOMAD [24,25] software, based on the mesh adaptive direct search (MADS) algorithm [8], was able to yield good results when optimizing both the architecture and the training hyperparameters simultaneously. The main drawback of this software is its lack of a global exploration strategy.…”
Section: Related Work
confidence: 99%
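The model-based alternative mentioned in the quote can be made concrete with a short sketch: a Gaussian process surrogate is fit to the points evaluated so far, and a cheap acquisition rule picks the next configuration to evaluate. The objective, bounds, and lower-confidence-bound acquisition below are illustrative assumptions, not details from the cited papers.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    def objective(x):
        # Stand-in for an expensive training run; x = [log10 learning rate].
        return (x[0] + 3.0) ** 2

    rng = np.random.default_rng(0)
    lo, hi = -6.0, 0.0

    # Initial design: a handful of random evaluations.
    X = rng.uniform(lo, hi, size=(5, 1))
    y = np.array([objective(x) for x in X])

    for _ in range(20):
        gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
        # Acquisition by candidate sampling: minimize a lower confidence bound.
        cand = rng.uniform(lo, hi, size=(256, 1))
        mu, sigma = gp.predict(cand, return_std=True)
        x_next = cand[np.argmin(mu - 1.96 * sigma)]
        X = np.vstack([X, x_next])
        y = np.append(y, objective(x_next))

    print("best point:", X[np.argmin(y)], "best value:", y.min())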
“…The open-source library HyperNOMAD [18,19] is designed as an adaptation of the NOMAD software [20] to optimize the hyperparameters of deep neural networks as formulated in (1). This package allows searching for both the architecture and the training regime of a convolutional network for a specific dataset.…”
Section: The HyperNOMAD Package
confidence: 99%
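NOMAD-derived tools treat the trained network as a blackbox: the optimizer writes a candidate point to a file, runs an evaluation program, and reads the objective from its standard output. The sketch below follows that general protocol; the hyperparameter decoding and the train_and_validate placeholder are hypothetical, not HyperNOMAD's actual internals.

    import sys

    def train_and_validate(num_layers, log_lr, batch_size):
        # Hypothetical placeholder for a full training run that would
        # return the validation error of the resulting network.
        return abs(num_layers - 4) + (log_lr + 3.0) ** 2 + abs(batch_size - 128) / 128.0

    def main():
        # NOMAD-style convention: argv[1] is a file holding one candidate point.
        with open(sys.argv[1]) as fh:
            x = [float(v) for v in fh.read().split()]
        num_layers, log_lr, batch_size = int(x[0]), x[1], int(x[2])
        # The objective value is reported on standard output.
        print(train_and_validate(num_layers, log_lr, batch_size))

    if __name__ == "__main__":
        main()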
“…The properties of DFO methods explain their popularity in the context of HPO of deep neural networks, where they are often included in specialized libraries such as Hyperopt [6] or Oríon [7]. Similarly, the HyperNOMAD toolbox [18,19] is developed as an adaptation of MADS to simultaneously optimize the architecture and the training phase of a CNN for a given dataset, as expressed in (1). This open-source library was shown to have competitive performance against other popular approaches such as Bayesian optimization [4] or random search [5] on the MNIST [21], Fashion-MNIST [29], and other datasets.…”
Section: Introduction
confidence: 99%
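Of the libraries named in the quote, Hyperopt has a compact public API; a minimal usage sketch follows. The search space and the objective are illustrative assumptions standing in for a real training loop.

    import math
    from hyperopt import fmin, tpe, hp

    def objective(params):
        # Stand-in for training a CNN and returning its validation error.
        return (math.log10(params["lr"]) + 3.0) ** 2 + abs(params["layers"] - 4)

    space = {
        "lr": hp.loguniform("lr", math.log(1e-5), math.log(1e-1)),
        "layers": hp.quniform("layers", 1, 8, 1),
    }

    best = fmin(fn=objective, space=space, algo=tpe.suggest, max_evals=50)
    print(best)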