2019 IEEE 12th International Conference on Cloud Computing (CLOUD)
DOI: 10.1109/cloud.2019.00097
Efficient Deep Learning Hyperparameter Tuning Using Cloud Infrastructure: Intelligent Distributed Hyperparameter Tuning with Bayesian Optimization in the Cloud

Cited by 38 publications (15 citation statements). References 1 publication.
“…Hyperparameter tuning provides an optimal parameter search method for the proposed Convolutional Neural Network model. A CNN combined with hyperparameter tuning requires setting parameters such as kernel size, strides, number of channels, and dropout rate [24]. Hyperparameter tuning finds the combination of parameters that yields the best results for the model [25].…”
Section: Hyperparameter Tuning
confidence: 99%
“…3) Distributed hyperparameter tuning: Since neural networks were first introduced, hyperparameter optimization (tuning) has been essential to improving their performance [20]. It is also well known to be a tedious and slow process, and for that reason several studies on distributing it to increase throughput have been carried out [21], [22]. It has also been proposed for medical image diagnosis [23], [24], [25], but those studies do not focus on efficiency.…”
Section: State of the Art
confidence: 99%
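The excerpt above motivates distributing trials across workers to speed up tuning. A minimal sketch of that idea using Python's standard `concurrent.futures` is given below; the search space, trial counts, and stand-in `evaluate` function are assumptions for illustration, not the distributed scheme of the cited paper.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def sample_config(rng):
    # Hypothetical two-parameter space; one config per trial.
    return {
        "learning_rate": 10 ** rng.uniform(-4, -1),
        "hidden_units": rng.choice([64, 128, 256]),
    }

def evaluate(config):
    # Stand-in for a full training run executed on one worker;
    # deterministic per configuration so results are reproducible.
    rng = random.Random(repr(sorted(config.items())))
    return config, rng.random()

def distributed_search(n_trials=8, n_workers=4, seed=0):
    """Sample configs up front, evaluate them in parallel, keep the best."""
    rng = random.Random(seed)
    configs = [sample_config(rng) for _ in range(n_trials)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        results = list(pool.map(evaluate, configs))
    return max(results, key=lambda pair: pair[1])

best_config, best_score = distributed_search()
print(best_config, round(best_score, 3))
```

Because the trials are independent, this embarrassingly parallel pattern is the same one that cloud-based tuners scale out across machines rather than threads.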
“…The purpose of hyperparameter tuning is to obtain optimal hyperparameters [33]. The hyperparameters tuned in this study include the number of hidden units, dropout rates, and learning rates.…”
Section: B. Hyperparameter Tuning Random Search
confidence: 99%
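For the random search named in this excerpt, a minimal sketch over the three listed hyperparameters follows. The ranges, the log-uniform sampling of the learning rate, and the `fake_accuracy` objective are all illustrative assumptions; the cited study does not publish its exact space or objective.

```python
import random

def sample_trial(rng):
    # Hypothetical ranges for the three hyperparameters the excerpt lists.
    return {
        "hidden_units": rng.choice([32, 64, 128, 256]),
        "dropout": rng.uniform(0.1, 0.5),
        "learning_rate": 10 ** rng.uniform(-4, -2),  # log-uniform sampling
    }

def random_search(n_trials, objective, seed=0):
    """Draw n_trials random configurations and return the best-scoring one."""
    rng = random.Random(seed)
    trials = [sample_trial(rng) for _ in range(n_trials)]
    scored = [(objective(t), t) for t in trials]
    return max(scored, key=lambda pair: pair[0])

def fake_accuracy(config):
    # Stand-in objective: rewards moderate dropout and a mid-range
    # learning rate instead of actually training a model.
    return (1.0
            - abs(config["dropout"] - 0.3)
            - abs(config["learning_rate"] - 1e-3) * 100)

score, best = random_search(20, fake_accuracy)
print(best)
```

Sampling the learning rate log-uniformly is a common choice because its useful values span orders of magnitude, whereas dropout is sampled uniformly over a narrow interval.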