2021
DOI: 10.48550/arxiv.2109.05319
Preprint

HyP-ABC: A Novel Automated Hyper-Parameter Tuning Algorithm Using Evolutionary Optimization

Abstract: Machine learning (ML) techniques lend themselves as promising decision-making and analytic tools in a wide range of applications. Different ML algorithms have various hyper-parameters, and tailoring an ML model to a specific application requires a large number of them to be tuned. Tuning the hyper-parameters directly affects performance (accuracy and run-time). However, for large-scale search spaces, efficiently exploring the ample number of hyper-parameter combinations is computationally challenging. …
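As a rough illustration of the idea the abstract describes (and not the authors' HyP-ABC implementation), the sketch below runs a minimal evolutionary-style hyper-parameter search with an ABC-flavored neighbour move and greedy selection. The search space, the placeholder fitness function, and the treatment of every hyper-parameter as continuous are all assumptions made for the example.

```python
# Hypothetical sketch of evolutionary hyper-parameter search in the spirit of
# ABC-style optimization; not the HyP-ABC algorithm from the paper.
import random

# Assumed search space (name -> (low, high)); for simplicity every
# hyper-parameter is treated as continuous.
SPACE = {"learning_rate": (1e-4, 1e-1), "max_depth": (2.0, 12.0)}

def random_candidate():
    """Sample one hyper-parameter configuration uniformly from SPACE."""
    return {k: random.uniform(lo, hi) for k, (lo, hi) in SPACE.items()}

def fitness(cfg):
    """Placeholder objective standing in for cross-validated accuracy."""
    # A real implementation would train and evaluate an ML model here.
    return -(cfg["learning_rate"] - 0.01) ** 2 - (cfg["max_depth"] - 6.0) ** 2 / 100

def neighbour(cfg, partner):
    """Perturb one dimension relative to a random partner (ABC-style move)."""
    new = dict(cfg)
    k = random.choice(list(SPACE))
    lo, hi = SPACE[k]
    phi = random.uniform(-1.0, 1.0)
    new[k] = min(hi, max(lo, cfg[k] + phi * (cfg[k] - partner[k])))
    return new

def search(pop_size=10, iterations=30):
    population = [random_candidate() for _ in range(pop_size)]
    for _ in range(iterations):
        for i, cfg in enumerate(population):
            cand = neighbour(cfg, random.choice(population))
            # Greedy selection: keep the better of the two configurations.
            if fitness(cand) > fitness(cfg):
                population[i] = cand
    return max(population, key=fitness)

if __name__ == "__main__":
    print("best configuration:", search())
```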

Cited by 1 publication (1 citation statement), published 2022.
References 8 publications.
“…For example, when performing a multi-classification task, the number of output nodes must equal the number of classes. Second, initialize the population and encode the hyper-parameters (learning rate, number of iterations, number of training rounds, number of hidden-layer units, number of hidden layers) for each individual in the population [31,32]. Third, calculate each individual's fitness value through the loss function and fitness function, sort the individuals by fitness value, and select the optimal top 5% and the remaining 95%.…”
Section: Improved Genetic Neural Network
Citation type: mentioning (confidence: 99%)
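The quoted passage walks through an encode-evaluate-select loop. A minimal sketch of that selection step is shown below, assuming illustrative value ranges, a placeholder loss-based fitness, and the 5% elite fraction mentioned in the quote; none of the names are from the cited paper.

```python
# Hedged sketch of the step described in the quoted passage: encode the
# hyper-parameters per individual, score them with a fitness function,
# sort, and split off the top 5% as elites. All ranges and the fitness
# proxy are assumptions for illustration.
import random

def random_individual():
    """Encode one individual's hyper-parameters (illustrative ranges)."""
    return {
        "learning_rate": random.uniform(1e-4, 1e-1),
        "n_iterations": random.randint(50, 500),
        "n_epochs": random.randint(5, 100),
        "hidden_units": random.randint(8, 256),
        "hidden_layers": random.randint(1, 5),
    }

def fitness(ind):
    """Placeholder for the loss-based fitness; a real run would train the network."""
    return -abs(ind["hidden_layers"] - 3) - abs(ind["learning_rate"] - 0.01)

def select(population, elite_frac=0.05):
    """Sort by fitness and split the population into the elite 5% and the rest."""
    ranked = sorted(population, key=fitness, reverse=True)
    n_elite = max(1, int(len(ranked) * elite_frac))
    return ranked[:n_elite], ranked[n_elite:]

population = [random_individual() for _ in range(100)]
elites, rest = select(population)
print(len(elites), "elites kept;", len(rest), "individuals passed to crossover/mutation")
```

In this reading, the elite 5% are carried over unchanged while the remaining individuals would feed the subsequent genetic operators, which is one plausible interpretation of the quoted 5%/95% split.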