2019
DOI: 10.1016/j.cor.2018.01.013
Tuning hyperparameters of a SVM-based water demand forecasting system through parallel global optimization

Cited by 72 publications (39 citation statements)
References 16 publications
“…According to the emerging interest in BO to solve black-box and expensive optimization problems [3] [4], we have proposed BO as a global optimization algorithm for the estimation of a promising starting model for FWI. We considered two alternative acquisition functions for BO and tested them on a 2D acoustic FWI benchmark problem, namely the Marmousi model.…”
Section: Discussion (mentioning)
confidence: 99%
“…A hyperparameter is a parameter whose value is used to control the learning process. The same type of machine-learning model can require different constraints, weights, or learning rates to generalize to different data patterns [20]. These measures are called hyperparameters and must be tuned so that the model can optimally solve the machine-learning problem.…”
Section: Hyper-tuning (mentioning)
confidence: 99%
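As a small illustration of the distinction drawn in the statement above, the sketch below (assuming scikit-learn; the data and values are purely illustrative, not from the cited work) sets hyperparameters before training and reads off the parameters learned from the data.

```python
# Hyperparameters such as C and gamma are fixed before training;
# the support vectors and dual coefficients are learned from the data.
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Hyperparameters: chosen (or tuned) before fitting, not learned.
clf = SVC(C=1.0, gamma=1e-3, kernel="rbf")

clf.fit(X, y)

# Learned parameters: determined by the training data.
print(clf.support_vectors_.shape, clf.dual_coef_.shape)
```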
“…For SVC, the optimum performance is achieved by tuning C, gamma and kernel [20]. After tuning, these parameters are found as: 'C': 1, 'gamma': 0.0001, 'kernel': 'rbf'.…”
Section: Support Vector Machine (mentioning)
confidence: 99%
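A minimal sketch of the kind of grid search described in the statement above, assuming scikit-learn's SVC and GridSearchCV; the dataset and search grid are illustrative, not the cited study's setup, and the tuned values quoted above (C=1, gamma=0.0001, kernel='rbf') are only shown as a possible outcome.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

# Candidate values for the SVC hyperparameters mentioned above.
param_grid = {
    "C": [0.1, 1, 10, 100],
    "gamma": [1e-4, 1e-3, 1e-2, "scale"],
    "kernel": ["rbf", "linear"],
}

# 5-fold cross-validated exhaustive search over the grid.
search = GridSearchCV(SVC(), param_grid, cv=5, n_jobs=-1)
search.fit(X, y)

print(search.best_params_)  # e.g. {'C': 1, 'gamma': 0.0001, 'kernel': 'rbf'}
print(search.best_score_)
```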
“…Recently, Bayesian optimization (BO) (Shahriari et al. 2015; Frazier 2018) is becoming one of the most widely adopted strategies for global optimization of multi-extremal and expensive-to-evaluate objective functions related to, e.g., sensor networks (Garnett et al. 2010), drug design (Meldgaard et al. 2018), time-series forecasting (Candelieri et al. 2018a), inversion problems (Perdikaris and Karniadakis 2016; Galuzzi et al. 2018), and robotics (Olofsson et al. 2018).…”
Section: Introduction (mentioning)
confidence: 99%
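The statement above refers to the sequential, surrogate-based loop behind BO. The sketch below is a minimal illustration under assumptions of its own (a scikit-learn Gaussian-process surrogate with an Expected Improvement acquisition on a one-dimensional stand-in objective); it is not the implementation of any cited paper.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def f(x):
    # Placeholder for an expensive, multi-extremal black-box objective.
    return np.sin(3 * x) + 0.1 * x ** 2

rng = np.random.default_rng(0)
bounds = (-3.0, 3.0)

# A few random initial evaluations.
X = rng.uniform(*bounds, size=(4, 1))
y = np.array([f(x[0]) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

def expected_improvement(x_cand, gp, y_best):
    # EI for minimization: expected amount by which a candidate
    # improves on the best observed value.
    mu, sigma = gp.predict(x_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    imp = y_best - mu
    z = imp / sigma
    return imp * norm.cdf(z) + sigma * norm.pdf(z)

for _ in range(20):  # sequential BO iterations
    gp.fit(X, y)                                   # refit the surrogate
    grid = np.linspace(*bounds, 500).reshape(-1, 1)
    ei = expected_improvement(grid, gp, y.min())
    x_next = grid[np.argmax(ei)]                   # maximize the acquisition
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next[0]))                 # one expensive evaluation

print("best x:", X[np.argmin(y)], "best f:", y.min())
```

Swapping the acquisition function (e.g., Expected Improvement for a lower-confidence-bound criterion) changes only the inner maximization step, which is the kind of comparison the first citation statement describes.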