2018
DOI: 10.1007/978-3-030-00374-6_5

Bayesian Optimization of the PC Algorithm for Learning Gaussian Bayesian Networks

Abstract: The PC algorithm is a popular method for learning the structure of Gaussian Bayesian networks. It carries out statistical tests to determine absent edges in the network. It is hence governed by two parameters: (i) the type of test, and (ii) its significance level. These parameters are usually set to values recommended by an expert. Nevertheless, such an approach can suffer from human bias, leading to suboptimal reconstruction results. In this paper we consider a more principled approach for choosing these para…

Cited by 10 publications (7 citation statements)
References 12 publications
“…3 Black-Box Bayesian Optimization BO has been used in a plethora of scenarios: from hyperparameter tuning of ML algorithms [Snoek et al, 2012; Córdoba et al, 2018] to renewable energies [Cornejo-Bueno et al, 2018] and a wide variety of other applications [Shahriari et al, 2016]. The BO process follows an iterative scheme in which it uses a probabilistic model as a surrogate, typically a Gaussian Process (GP), which is a prior over functions.…”
Section: Related Work
confidence: 99%
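The iterative BO scheme quoted above (a GP surrogate plus an acquisition rule that picks the next evaluation point) can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the RBF kernel, its length-scale, the lower-confidence-bound acquisition, and all function names here are assumptions.

```python
import numpy as np

def rbf_kernel(a, b, length=0.3):
    """Squared-exponential kernel matrix between 1-D point arrays a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-6):
    """GP posterior mean and variance at x_query, given noisy observations."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_query)
    K_inv = np.linalg.inv(K)
    mu = K_s.T @ K_inv @ y_train
    # Prior variance is k(x, x) = 1 for the RBF kernel above.
    var = 1.0 - np.sum((K_s.T @ K_inv) * K_s.T, axis=1)
    return mu, np.maximum(var, 1e-12)

def bayes_opt(f, bounds=(0.0, 1.0), n_init=3, n_iter=10, seed=0):
    """Minimise black-box f on [bounds] with a GP surrogate and LCB acquisition."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(bounds[0], bounds[1], n_init)   # initial random design
    y = np.array([f(v) for v in x])
    grid = np.linspace(bounds[0], bounds[1], 200)   # candidate points
    for _ in range(n_iter):
        mu, var = gp_posterior(x, y, grid)
        lcb = mu - 2.0 * np.sqrt(var)               # lower confidence bound
        x_next = grid[np.argmin(lcb)]               # most promising candidate
        x = np.append(x, x_next)
        y = np.append(y, f(x_next))                 # evaluate the black box
    best = np.argmin(y)
    return x[best], y[best]
```

In the paper's setting, `f` would be the network-reconstruction error of PC as a function of its test type and significance level; here it can be any cheap 1-D test function.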
“…For example, the estimation of the generalization error of ML algorithms is considered to be a black-box function. We find other applications in structure learning of probabilistic graphical models [7], astrophysics [3], or even subjective tasks such as suggesting better recipes [11].…”
Section: Introduction
confidence: 99%
“…The estimation of the generalization error of ML algorithms is considered to be a black-box function. We find other applications in structure learning of probabilistic graphical models [2] or even subjective tasks such as suggesting better recipes [4]. When not only one but several black-boxes are optimized, we deal with the multi-objective BO scenario [9].…”
Section: Introduction
confidence: 99%