Proceedings of the 3rd International Conference on Applications of Intelligent Systems 2020
DOI: 10.1145/3378184.3378193

Confidence Bound Minimization for Bayesian optimization with Student's-t Processes

Abstract: Bayesian optimization seeks the global optimum of a black-box objective function f(x) in the fewest possible iterations. Recent work incorporated knowledge of the true value of the optimum into the Gaussian process probabilistic model typically used in Bayesian optimization. Together with a new acquisition function, Confidence Bound Minimization, this yielded a Gaussian posterior whose predictions were no greater than the known maximum (and no less than the known minimum). Our novel work …
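The abstract is truncated, but the core loop it describes can be sketched. Below is a minimal, hypothetical toy (not the authors' code): a 1-D Gaussian-process Bayesian optimization loop whose acquisition step picks the candidate whose upper confidence bound lies closest to the known optimum value f*. The kernel, the β weight, and this reading of "Confidence Bound Minimization" are all assumptions for illustration.

```python
import numpy as np

def rbf(A, B, ls=0.5):
    # Squared-exponential kernel between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # Standard GP regression posterior mean and std at test points Xs.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(np.diag(rbf(Xs, Xs)) - (v * v).sum(0), 1e-12, None)
    return mu, np.sqrt(var)

def cbm_acquisition(mu, sigma, f_star, beta=2.0):
    # Hypothetical reading of Confidence Bound Minimization: choose the
    # candidate whose upper confidence bound is closest to the known
    # optimum value f_star.
    ucb = mu + beta * sigma
    return np.argmin(np.abs(f_star - ucb))

# Toy 1-D run: objective with known maximum value 1.0.
rng = np.random.default_rng(0)
f = lambda x: np.sin(3 * x)
f_star = 1.0
X = rng.uniform(0, 2, (4, 1))
y = f(X).ravel()
Xs = np.linspace(0, 2, 200)[:, None]
for _ in range(10):
    mu, sigma = gp_posterior(X, y, Xs)
    i = cbm_acquisition(mu, sigma, f_star)
    X = np.vstack([X, Xs[i]])
    y = np.append(y, f(Xs[i, 0]))
print(round(float(y.max()), 3))
```

Note that this sketch uses a plain Gaussian process; the paper's contribution replaces it with a Student's-t process, whose heavier tails change the posterior predictive and hence the bound.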

Cited by 1 publication (3 citation statements). References 7 publications.
“…Our work enhances Bayesian optimization by comparing STP (ν = 5) ERM versus GP ERM [9]. The data is split 85/15 between training and testing [18]. 3-fold cross-validation of the XGBoost classifier is averaged to measure [9, 18].…”
Section: Application: XGBoost Hyperparameter Tuning
Confidence: 99%
“…The data is split 85/15 between training and testing [18]. 3-fold cross-validation of the XGBoost classifier is averaged to measure [9, 18]. We use a logistic objective function [9, 18], with 5 random initializations [19] and 30 post-initialization iterations [9, 18].…”
Section: Application: XGBoost Hyperparameter Tuning
Confidence: 99%
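The quoted statements describe a concrete evaluation protocol: an 85/15 train/test split followed by averaged 3-fold cross-validation. A dependency-free sketch of that protocol follows; the random data and the majority-class stand-in classifier are hypothetical (the cited work tunes an XGBoost classifier, which is omitted here to keep the sketch self-contained).

```python
import random
from statistics import mean

random.seed(0)

# Hypothetical stand-in data: (feature, binary label) pairs.
data = [(random.random(), random.randint(0, 1)) for _ in range(200)]

# 85/15 train/test split, as described in the citing text [18].
random.shuffle(data)
cut = int(0.85 * len(data))
train, test = data[:cut], data[cut:]

def majority_label(rows):
    # Trivial "classifier": predict the most common training label.
    labels = [y for _, y in rows]
    return max(set(labels), key=labels.count)

def accuracy(rows, label):
    return mean(1.0 if y == label else 0.0 for _, y in rows)

# 3-fold cross-validation on the training split; fold scores are
# averaged, mirroring the protocol in the quoted statements [9, 18].
k = 3
folds = [train[i::k] for i in range(k)]
scores = []
for i in range(k):
    held_out = folds[i]
    fit_rows = [r for j in range(k) if j != i for r in folds[j]]
    scores.append(accuracy(held_out, majority_label(fit_rows)))
cv_score = mean(scores)
print(round(cv_score, 3))
```

In the cited setup, the averaged fold score would be the objective that the Bayesian optimization loop (5 random initializations, 30 further iterations) maximizes over the classifier's hyperparameters.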