2020
DOI: 10.1609/aaai.v34i04.5971
Bayesian Optimization for Categorical and Category-Specific Continuous Inputs

Abstract: Many real-world functions are defined over both categorical and category-specific continuous variables and thus cannot be optimized by traditional Bayesian optimization (BO) methods. To optimize such functions, we propose a new method that formulates the problem as a multi-armed bandit problem, wherein each category corresponds to an arm with its reward distribution centered around the optimum of the objective function in continuous variables. Our goal is to identify the best arm and the maximizer of the corre…
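The abstract's formulation can be illustrated with a minimal sketch: treat each category as a bandit arm selected by UCB, and run a simple continuous search inside the chosen arm. The toy objective, the local Gaussian search step, and all function names here are illustrative assumptions (the paper's actual method uses Gaussian-process-based BO per arm, not random search):

```python
# Hedged sketch: bandit over categories, crude continuous search per arm.
# Stand-in for the per-category Bayesian optimization described in the paper.
import math
import random

random.seed(0)

def objective(cat, x):
    # Toy category-specific objective (illustrative): each category has
    # its own continuous optimum and peak value.
    optima = {"A": 0.2, "B": 0.6, "C": 0.9}
    peaks = {"A": 1.0, "B": 1.5, "C": 1.2}
    return peaks[cat] - (x - optima[cat]) ** 2

def bandit_bo(categories, budget=300):
    counts = {c: 0 for c in categories}
    sums = {c: 0.0 for c in categories}
    best = {c: (None, -math.inf) for c in categories}  # (x, value) per arm

    for t in range(1, budget + 1):
        # UCB arm selection; untried arms are pulled first.
        def ucb(c):
            if counts[c] == 0:
                return math.inf
            return sums[c] / counts[c] + math.sqrt(2 * math.log(t) / counts[c])

        cat = max(categories, key=ucb)

        # Inner continuous step: sample near the incumbent (a crude
        # stand-in for a GP acquisition function in the real method).
        x_inc, _ = best[cat]
        if x_inc is None:
            x = random.random()
        else:
            x = min(1.0, max(0.0, random.gauss(x_inc, 0.1)))

        y = objective(cat, x)
        counts[cat] += 1
        sums[cat] += y
        if y > best[cat][1]:
            best[cat] = (x, y)

    # Return the best arm and its incumbent maximizer.
    return max(best.items(), key=lambda item: item[1][1])

cat, (x, y) = bandit_bo(["A", "B", "C"])
print(cat, round(x, 2), round(y, 2))
```

With this toy objective the bandit concentrates pulls on the category whose continuous optimum has the highest value, mirroring the paper's goal of jointly identifying the best arm and the maximizer.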

Cited by 35 publications (36 citation statements)
References 10 publications
“…It makes use of a mix of multi-armed bandits and Gaussian processes. Similar ideas are utilised in [22,28]. Another interesting new research direction is to combine the advantages of Gaussian processes and artificial neural networks [17], although more research is required to make this computationally feasible for larger problems.…”
Section: Related Work
confidence: 99%
“…Linear models [3], Gaussian processes with kernels over discrete structures [46,15,24,36], and random forests [30] are considered for surrogate modeling. Search strategies include mathematical optimization [3,13,15], heuristic search methods [30,46,14], and combinations thereof [16,66,44]. [31] provided a BO approach for tree-structured spaces.…”
Section: Kernels Over Structured Data
confidence: 99%
“…The key technical challenge is that existing BO approaches [9,20,42] cannot be naively adapted to explanation generation. In the hyperparameter tuning setting, categorical variables typically have very low cardinality (e.g., with 2-3 distinct values [30]). In the query explanation setting, however, a categorical variable can have many more distinct values.…”
Section: SQL Explain
confidence: 99%
“…However, it does not scale well to variables with many distinct values [35]. BO may use tree-based surrogate models (e.g., random forests [20], tree Parzen estimators [9]) to handle categorical variables; however, their predictive accuracy is empirically poor [16,30]. Other work optimizes a combinatorial search space [5,12,31], and categorical/category-specific continuous variables [30].…”
Section: Related Work
confidence: 99%