2013
DOI: 10.1007/978-3-319-00032-9_39
A MCMC Approach for Learning the Structure of Gaussian Acyclic Directed Mixed Graphs

Abstract: Graphical models are widely used to encode conditional independence constraints and causal assumptions, the directed acyclic graph (DAG) being one of the most common families of models. However, DAGs are not closed under marginalization: that is, if a distribution is Markov with respect to a DAG, several of its marginals might not be representable with another DAG unless one discards some of the structural independencies. Acyclic directed mixed graphs (ADMGs) generalize DAGs so that closure under marginalization…
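The closure problem the abstract describes can be seen in a small Gaussian structural equation model. The sketch below uses hypothetical edge weights (not from the paper): a latent variable h with children x1 and x2, plus an edge x1 → x3. Marginalizing out h leaves x1 and x2 dependent even though neither causes the other, a dependence an ADMG would encode with a bidirected edge x1 ↔ x2 rather than by adding a spurious directed edge.

```python
import numpy as np

# Structural coefficients B (rows = children, columns = parents) over
# (h, x1, x2, x3); the weights 0.8, 0.7, 0.6 are illustrative only.
B = np.array([
    [0.0, 0.0, 0.0, 0.0],  # h   (latent root)
    [0.8, 0.0, 0.0, 0.0],  # x1 = 0.8*h + e1
    [0.7, 0.0, 0.0, 0.0],  # x2 = 0.7*h + e2
    [0.0, 0.6, 0.0, 0.0],  # x3 = 0.6*x1 + e3
])
I = np.eye(4)
A = np.linalg.inv(I - B)          # maps errors to variables
Sigma_full = A @ A.T              # implied covariance, unit error variances
Sigma_marg = Sigma_full[1:, 1:]   # marginalize out the latent h

# Cov(x1, x2) = 0.8 * 0.7 = 0.56: nonzero with no directed path between
# x1 and x2 among the observed variables, hence the bidirected edge.
print(np.round(Sigma_marg, 3))
```

The same marginal covariance arises from any reparameterization of the latent, which is why ADMGs parameterize the bidirected part directly instead of modeling h explicitly.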

Cited by 7 publications (16 citation statements)
References 11 publications
“…Figures 3 and 4 provide some illustration of the behavior of Bayesian projections. In Figures 3, our implied prior over model structures provides more diffuse posteriors than the one adopted by the more traditional Bayesian approach introduced by [20]. Both posteriors still centered close to the right model for data sets of size 10000, but Bayesian projections does still have a somewhat broad posterior, which in some sense reflects the greater insensitivity of the Frobenius norm compared to the likelihood function of the Gibbs procedure.…”
Section: Results
confidence: 91%
“…In [20], we introduce a Gibbs sampler for the graphical structure G in the Gaussian case. [23] introduces a new variation of the idea, where the graphical structure does not encode hard constraints: instead each edge represents a mixture indicator, with the lack of an edge representing a prior for σ ij strongly concentrated around zero, and the presence of an edge as indicating a high variance prior.…”
Section: Case Study II: Gaussian Marginal Independence Models
confidence: 99%
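The variation described in this excerpt, where an edge acts as a mixture indicator over priors for σ_ij, can be sketched as a spike-and-slab draw for a single off-diagonal covariance entry. This is a minimal illustration under assumed hyperparameters (the spike/slab standard deviations and edge prior below are hypothetical, not the cited paper's actual sampler or values):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hyperparameters: "spike" = no edge i <-> j, so sigma_ij is
# strongly concentrated around zero; "slab" = edge present, diffuse prior.
SPIKE_SD = 0.01
SLAB_SD = 1.0
EDGE_PRIOR = 0.5  # prior probability that the bidirected edge is present

def sample_sigma_ij(rng):
    """Draw (edge indicator, sigma_ij) from the two-component mixture prior."""
    edge = rng.random() < EDGE_PRIOR
    sd = SLAB_SD if edge else SPIKE_SD
    return edge, rng.normal(0.0, sd)

draws = [sample_sigma_ij(rng) for _ in range(10_000)]
with_edge = np.array([s for e, s in draws if e])
no_edge = np.array([s for e, s in draws if not e])
# Draws made with an edge present are far more dispersed than those without.
print(with_edge.std(), no_edge.std())
```

A full sampler must additionally keep the whole covariance matrix positive definite, which is where the hard work of the MCMC schemes discussed above lies; the mixture prior itself is only the per-edge building block.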
“…In order to overcome this problem, Wang () proposed a method to perform covariance model selection with improved computational efficiency. On the other hand, Silva and Kalaitzis () developed an approach to improve the efficiency of MCMC algorithms used to perform Bayesian inference and showed its application in covariance model selection, and Silva () proposed a method based on acyclic directed mixed graphs (a generalization of directed acyclic graphs) that can be used to estimate the covariance matrix when the pattern of zeros is unknown. Some of these methods could be implemented in genome‐wide prediction following approaches similar to those presented in this study.…”
Section: Discussion
confidence: 99%
“…There are several contributions that propose Bayesian methods in related contexts. In [28], the author proposes a method to perform full Bayesian inference for acyclic directed mixed graphs using Markov chain Monte Carlo (MCMC) methods. There are several similarities between the model there (based upon structural equation models (SEM) e.g .…”
Section: Introduction
confidence: 99%