2017
DOI: 10.1115/1.4037450

Bayesian Annealed Sequential Importance Sampling: An Unbiased Version of Transitional Markov Chain Monte Carlo

Abstract: The transitional Markov chain Monte Carlo (TMCMC) is one of the efficient algorithms for performing Markov chain Monte Carlo (MCMC) in the context of Bayesian uncertainty quantification on parallel computing architectures. However, the features associated with its efficient sampling are also responsible for introducing bias into the sampling. We demonstrate that the Markov chains of each subsample in TMCMC may result in uneven chain lengths that distort the intermediate target distributions and i…

Cited by 34 publications (35 citation statements)
References 26 publications
“…Note that computation of the evidence is usually a challenging task; however, the implemented TMCMC algorithm is capable of providing model evidence as a by‐product with no additional computation cost. Recent studies have shown that the TMCMC method provides slightly biased estimates for the evidence. However, such bias does not affect the qualitative results presented in this paper, because the evidence estimates for each model class are far apart, as denoted in the table.…”
Section: Numerical Evaluation of the Proposed Framework
confidence: 88%
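The evidence by-product noted in the statement above follows from the structure of the annealing: at each TMCMC/BASIS stage, the sample mean of the importance weights estimates a ratio of successive normalizing constants, and the product over stages estimates the evidence. A minimal sketch of that bookkeeping (the fixed sample set reused at every stage is an illustrative simplification; real TMCMC resamples and moves the particles between stages):

```python
import numpy as np

def tmcmc_log_evidence(log_likes, betas):
    """Accumulate the log-evidence estimate from TMCMC stage weights.

    log_likes : log-likelihood values of the current samples.
    betas     : increasing annealing exponents, betas[0] = 0, betas[-1] = 1.

    At stage j the importance weights are w_i = L(theta_i)**(beta_{j+1} - beta_j),
    and mean(w) estimates Z_{j+1} / Z_j; summing the log-means over all
    stages gives an estimate of log Z.
    """
    log_likes = np.asarray(log_likes, dtype=float)
    log_Z = 0.0
    for b0, b1 in zip(betas[:-1], betas[1:]):
        log_w = (b1 - b0) * log_likes
        # log-mean-exp for numerical stability
        m = log_w.max()
        log_Z += m + np.log(np.mean(np.exp(log_w - m)))
    return log_Z
```

With a single stage (betas = [0, 1]) this reduces to the log of the mean likelihood under the prior samples, i.e., the standard importance-sampling evidence estimate.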
“…For each EIM basis, 2500 posterior samples of ε_LJ and σ_LJ are recorded to be used in the post-processing step of our HSM analysis. We use BASIS to draw the samples because it is highly parallel and the evidence is a by-product of the algorithm [31]. Then, we draw the posterior samples of the hyperparameters ψ with the post-processing step in our method.…”
Section: Molecular Dynamics: Krypton
confidence: 99%
“…The marginalized posterior of θ is estimated by substituting Equations 39 and 41 into Equation 31, where we substitute ψ with σ_y:…”
Section: A1 Non-Hierarchical Model M_1a
confidence: 99%
“…In the original TMCMC, MCMC is performed for each unique sample in Θ̃_{j+1}, with chain length equal to the number of occurrences of that sample. In [40] it is shown that, in order for the bias to be reduced, all samples in Θ̃_{j+1} should perform an MCMC step with chain length equal to a parameter ℓ_max, which is usually set to 1. The improved algorithm is called BASIS, and an efficient implementation can be found in the Π4U framework [18].…”
Section: Transitional Markov Chain Monte Carlo (TMCMC)
confidence: 99%
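The chain-length difference described in the statement above can be sketched as follows. Here `resampled` is a hypothetical list of resampled parameter identifiers (duplicates included), standing in for the resampled population Θ̃_{j+1}:

```python
from collections import Counter

def tmcmc_chain_lengths(resampled):
    """Original TMCMC: one Markov chain per *unique* resampled sample,
    with chain length equal to that sample's number of occurrences.
    This yields uneven chain lengths, the source of the bias."""
    return dict(Counter(resampled))

def basis_chain_lengths(resampled, l_max=1):
    """BASIS: every resampled sample (duplicates included) runs its own
    chain of fixed length l_max, usually 1, so all chains are even."""
    return [(s, l_max) for s in resampled]
```

For example, a resampled population `['a', 'a', 'b']` gives TMCMC chains of lengths {a: 2, b: 1}, whereas BASIS runs three separate chains of length 1, one per resampled particle.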