2015
DOI: 10.1080/01621459.2014.896806

Bayesian Inference of Multiple Gaussian Graphical Models

Abstract: In this paper, we propose a Bayesian approach to inference on multiple Gaussian graphical models. Specifically, we address the problem of inferring multiple undirected networks in situations where some of the networks may be unrelated, while others share common features. We link the estimation of the graph structures via a Markov random field (MRF) prior which encourages common edges. We learn which sample groups have a shared graph structure by placing a spike-and-slab prior on the parameters that measure net…
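The MRF prior described in the abstract couples the inclusion of each edge across the K sample groups. A minimal sketch of how such a prior can be written is below; the notation (g_ij, ν_ij, Θ) is ours and the paper should be consulted for the exact formulation.

```latex
% Sketch of an MRF prior linking edge inclusion across K graphs (notation ours).
% Let g_{ij} = (g_{ij}^{(1)}, \dots, g_{ij}^{(K)})^\top collect the binary
% inclusion indicators of edge (i, j) in the K group-specific graphs.
\[
  p\bigl(g_{ij} \mid \nu_{ij}, \Theta\bigr)
  \;=\; C(\nu_{ij}, \Theta)^{-1}
  \exp\bigl(\nu_{ij}\,\mathbf{1}_K^{\top} g_{ij}
            \;+\; g_{ij}^{\top}\,\Theta\,g_{ij}\bigr),
\]
% where \nu_{ij} controls overall edge sparsity and the off-diagonal entries of
% the symmetric K x K matrix \Theta measure pairwise network relatedness; the
% spike-and-slab prior mentioned in the abstract is placed on those entries, so
% a zero entry decouples the corresponding pair of graphs.
```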

Cited by 163 publications (219 citation statements)
References 36 publications
“…Finally, ignoring the weights W_{kk′} in (4), the Laplacian shrinkage penalty resembles the Markov random field (MRF) prior used in Bayesian variable selection with structured covariates [16]. While our paper was under review, we became aware of the recent work by Peterson et al [23], who utilize an MRF prior to develop a Bayesian framework for estimation of multiple Gaussian graphical models. This method assumes that edges between pairs of random variables are formed independently, and is hence more suited for Erdős–Rényi networks.…”
Section: Model and Estimator (mentioning)
confidence: 99%
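As a toy illustration of the independent-edge assumption flagged in the statement above (our own sketch, not code from either paper): under an Erdős–Rényi model every possible edge is included independently with a common probability.

```python
# Toy illustration (ours) of the independent-edge assumption: in an
# Erdos-Renyi model, each of the p*(p-1)/2 possible edges is included
# with the same probability, independently of all the others.
import numpy as np

rng = np.random.default_rng(seed=1)
p, edge_prob = 10, 0.2                                # number of nodes, edge-inclusion probability

upper = np.triu(rng.random((p, p)) < edge_prob, k=1)  # independent Bernoulli draws, upper triangle
adjacency = (upper | upper.T).astype(int)             # symmetrize for an undirected graph

print(adjacency.sum() // 2, "edges out of", p * (p - 1) // 2, "possible")
```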
“…Peterson et al and a few others (Carvalho and West, 2007; Carvalho et al., 2007a,b; Carvalho and Scott, 2009; Mitsakakis et al., 2011; Wang and Li, 2012; Lehermeier et al., 2013; Peterson et al., 2014) have provided detailed MCMC algorithms for G-Wishart distributions in Gaussian graphical model settings. We utilized similar sampling procedures in our hierarchical Bayesian multivariate state space model for network construction.…”
Section: Hierarchical Bayesian Multivariate State Space Models and Mo… (mentioning)
confidence: 99%
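For context, the G-Wishart distribution referenced above restricts a Wishart-type law to precision matrices compatible with a graph G; the form below uses a common parameterization and our own notation, which may differ from the convention in the cited works.

```latex
% A common form of the G-Wishart density W_G(\delta, D) (notation ours).
\[
  p(\Omega \mid G) \;=\; I_G(\delta, D)^{-1}\,
    |\Omega|^{(\delta - 2)/2}
    \exp\!\Bigl(-\tfrac{1}{2}\,\mathrm{tr}(D\,\Omega)\Bigr),
  \qquad \Omega \in M^{+}(G),
\]
% where M^{+}(G) is the cone of positive definite matrices with \Omega_{ij} = 0
% whenever edge (i, j) is absent from G. The normalizing constant I_G(\delta, D)
% has no closed form for general (non-decomposable) graphs, which is what makes
% the MCMC schemes cited above necessary.
```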
“…The advantage of using the precision matrix instead of the covariance matrix is that, although the latter is simpler to write down, it makes the MCMC run very slowly. Here ρ_t > 2 is the degrees of freedom and R_t is a P × P symmetric positive definite scale matrix (Carvalho and West, 2007; Carvalho et al., 2007a; Peterson et al., 2014). Note that the multivariate normal prior assumption could be modified to a conjugate multivariate t-distribution with an inverse-Gamma hyper-prior, which is heavier tailed and thus more robust to noisy expression data (Finegold and Drton, 2014).…”
Section: Hierarchical Bayesian Multivariate State Space Models and Mo… (mentioning)
confidence: 99%
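A hypothetical sketch of the Wishart prior on a precision matrix described in the statement above (our own code, not the cited authors'): Omega_t ~ W(rho_t, R_t) with rho_t degrees of freedom and a P × P positive definite scale matrix R_t. SciPy's parameterization is assumed here; the cited works may use a different convention.

```python
# Draw a precision matrix from a Wishart prior W(rho_t, R_t) (illustrative only).
# SciPy's density is proportional to |Omega|^{(df - P - 1)/2} exp(-tr(scale^{-1} Omega)/2).
import numpy as np
from scipy.stats import wishart

P = 5
rho_t = P + 2                   # degrees of freedom (SciPy requires df >= P)
R_t = np.eye(P)                 # P x P symmetric positive definite scale matrix

Omega_t = wishart(df=rho_t, scale=R_t).rvs(random_state=0)  # sampled precision matrix
Sigma_t = np.linalg.inv(Omega_t)                            # implied covariance matrix

print(np.all(np.linalg.eigvalsh(Omega_t) > 0))              # True: the draw is positive definite
```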