2015
DOI: 10.1002/sta4.97
Parallel Markov chain Monte Carlo for non‐Gaussian posterior distributions

Abstract: Recent developments in big data and analytics research have produced an abundance of large data sets that are too big to be analyzed in their entirety, due to limits on computer memory or storage capacity. To address these issues, communication-free parallel MCMC methods have been developed; these partition the data into subsets, perform independent Bayesian MCMC analysis on each subset, and combine the subset posterior samples to estimate the full-data posterior. Since our method estimates only marginal densities, there is no limitation on the number of model parameters analyzed. Our procedure is suitable for Bayesian models with unknown parameters of fixed dimension in continuous parameter spaces.

Cited by 11 publications (16 citation statements). References 23 publications.
“…By assuming that the subposterior densities are independent, the subposterior samples are combined to estimate the full data posterior density by the following (Scott et al.; Neiswanger et al.; Wang & Dunson; Miroshnikov et al.): $p(\boldsymbol{\gamma} \mid \boldsymbol{X}) \propto p(\boldsymbol{X} \mid \boldsymbol{\gamma})\, p(\boldsymbol{\gamma}) = \prod_{m=1}^{M} p(\boldsymbol{X}_m \mid \boldsymbol{\gamma})\, p(\boldsymbol{\gamma})^{1/M} = \prod_{m=1}^{M} p_m(\boldsymbol{\gamma} \mid \boldsymbol{X}_m)$. …”
Section: Methods (mentioning; confidence: 99%)
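The product-of-subposteriors identity quoted above can be checked numerically in a toy setting. The sketch below (illustrative only, not the paper's method) uses a Gaussian mean with known variance and a flat prior, so every subposterior is Gaussian in closed form and the combination reduces to multiplying Gaussians (precisions add; means are precision-weighted); all variable names and numbers are made up:

```python
# Toy check of p(gamma | X) ∝ ∏_m p_m(gamma | X_m) for a Gaussian mean
# with known variance and a flat prior (subposteriors are exact Gaussians).
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0                                   # known observation s.d.
x = rng.normal(1.5, sigma, size=1200)         # full data set

M = 4
shards = np.array_split(x, M)                 # partition into M subsets

# Subposterior for shard m under a flat prior: N(mean_m, sigma^2 / n_m)
means = np.array([s.mean() for s in shards])
precs = np.array([len(s) / sigma**2 for s in shards])

# Product of Gaussian densities: precisions add, means are precision-weighted
prec_comb = precs.sum()
mean_comb = (precs * means).sum() / prec_comb

# Full-data posterior computed directly, for comparison
mean_full = x.mean()
prec_full = len(x) / sigma**2

print(mean_comb, mean_full)                   # agree up to floating point
print(prec_comb, prec_full)
```

Here the combination is exact because every subposterior is Gaussian; the methods discussed in this report differ precisely in how they approximate this product when the subposteriors are only available as MCMC draws.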
“…There are many recent techniques for combining subset MCMC samples, including sample averaging, weighted averaging (consensus Monte Carlo; Scott et al.), kernel smoothing (Neiswanger et al.), and Weierstrass sampling (Wang and Dunson). These methods work well when the posterior distribution is Gaussian; however, these methods do not perform as well when the posterior is non‐Gaussian (Miroshnikov et al.). Recently, a direct density product method was introduced for non‐Gaussian posteriors (Miroshnikov et al.).…”
Section: Methods (mentioning; confidence: 99%)
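The weighted-averaging (consensus Monte Carlo) combination mentioned in the quote can be sketched as follows. This assumes each of M shards has already produced S subposterior draws of a scalar parameter, and weights each shard by its inverse subposterior variance; the draws and their moments below are synthetic, purely for illustration:

```python
# Sketch of consensus-Monte-Carlo-style weighted averaging of
# subposterior draws (scalar parameter; synthetic subposteriors).
import numpy as np

rng = np.random.default_rng(1)
S, M = 5000, 4

# Hypothetical subposterior draws: shard m draws ~ N(mu_m, var_m)
mus = np.array([1.40, 1.60, 1.50, 1.55])
vars_ = np.array([0.04, 0.05, 0.03, 0.06])
draws = rng.normal(mus, np.sqrt(vars_), size=(S, M))

# Weight each shard by its estimated inverse subposterior variance
w = 1.0 / draws.var(axis=0, ddof=1)

# Combine draw-by-draw: S consensus draws for the full-data posterior
consensus = (draws * w).sum(axis=1) / w.sum()

print(consensus.mean())   # close to the precision-weighted mean of mus
```

When the subposteriors really are Gaussian, this weighted average targets the correct product density; the quote's point is that for non-Gaussian posteriors this approximation degrades, motivating density-product approaches.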
“…To address these issues, numerous alternative communication-free parallel MCMC methods have been developed for Bayesian analysis of big data. These methods partition data into subsets, perform independent Bayesian MCMC analysis on each subset, and combine the subset posterior samples to estimate the full data posterior; see [23,19,17].…”
Section: Introduction (mentioning; confidence: 99%)
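The kernel-smoothing style of combination listed among these methods can be sketched by multiplying per-shard kernel density estimates of a single marginal on a grid, which also connects to the abstract's focus on marginal densities. The draws below are synthetic, the scalar-parameter setup is an assumption, and this is a grid-based simplification rather than any paper's exact algorithm:

```python
# Sketch: combine subposteriors by multiplying per-shard kernel density
# estimates of one marginal on a grid (in the spirit of kernel smoothing).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
M, S = 4, 4000

# Hypothetical subposterior draws for a single scalar marginal
draws = [rng.normal(1.5 + 0.05 * m, 0.2, size=S) for m in range(M)]

grid = np.linspace(0.5, 2.5, 400)
log_prod = np.zeros_like(grid)
for d in draws:
    log_prod += np.log(gaussian_kde(d)(grid))  # multiply densities in log space

# Exponentiate stably and renormalize on the grid
dens = np.exp(log_prod - log_prod.max())
dx = grid[1] - grid[0]
dens /= dens.sum() * dx

post_mean = (grid * dens).sum() * dx
print(post_mean)
```

Working in log space avoids underflow when many shard densities are multiplied, and the grid restricts this sketch to one marginal at a time, which is exactly the regime where per-marginal combination methods avoid the curse of dimensionality.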