2022
DOI: 10.1007/s43069-022-00128-w

Use of Static Surrogates in Hyperparameter Optimization

Cited by 1 publication (1 citation statement) · References 12 publications
“…The problem mentioned above can be handled by adjusting the hyper-parameters of the neural network intelligently, including the static and dynamic optimization neural network (SONN and DONN). The first one adjusts the hyper-parameters outside the iterative process of the neural network; the second dynamically adjusts the hyper-parameters inside the iterative process. The typical static optimization techniques are grid search and evolution strategies. These techniques can explore different neural network hyper-parameters, but slowly. On the other hand, gradient methods and neural network structure adjustment methods are two categories of dynamic optimization techniques.…”
Section: Introduction
confidence: 99%
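The static/dynamic split in the statement above is easiest to see in code. Below is a minimal sketch of the static case: a grid search that evaluates each hyper-parameter setting with a complete, independent training run, entirely outside the network's training loop. The parameter names, grid values, and scoring function are illustrative assumptions, not taken from the cited paper.

```python
# Minimal sketch of static hyper-parameter optimization (grid search).
# The grid and the scoring proxy below are illustrative assumptions.
import itertools

def train_and_score(learning_rate, hidden_units):
    """Stand-in for a full training run: trains a network with the given
    hyper-parameters and returns a validation score (higher is better).
    A toy analytic proxy is used here so the sketch runs without data."""
    return -(learning_rate - 0.01) ** 2 - 1e-6 * (hidden_units - 64) ** 2

grid = {
    "learning_rate": [0.001, 0.01, 0.1],
    "hidden_units": [32, 64, 128],
}

best_score, best_config = float("-inf"), None
# Static optimization: each candidate is scored by its own complete run;
# nothing is adjusted while any individual training is in progress.
for values in itertools.product(*grid.values()):
    config = dict(zip(grid.keys(), values))
    score = train_and_score(**config)
    if score > best_score:
        best_score, best_config = score, config

print(f"best config: {best_config}, score: {best_score:.6f}")
```

A dynamic method would instead modify hyper-parameters (e.g., the learning rate) inside the training loop itself, which is why the quoted passage treats gradient-based and structure-adjustment methods as a separate category.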