2019
DOI: 10.1016/j.jcp.2018.12.008
Localization for MCMC: sampling high-dimensional posterior distributions with local structure

Abstract: We investigate how ideas from covariance localization in numerical weather prediction can be used in Markov chain Monte Carlo (MCMC) sampling of high-dimensional posterior distributions arising in Bayesian inverse problems. To localize an inverse problem is to enforce an anticipated "local" structure by (i) neglecting small off-diagonal elements of the prior precision and covariance matrices; and (ii) restricting the influence of observations to their neighborhood. For linear problems we can specify the cond…
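As a concrete illustration of step (i), here is a minimal sketch (not from the paper; the function name and toy prior are illustrative) of truncating a prior covariance matrix to banded form by zeroing far-off-diagonal entries:

```python
import numpy as np

def localize_covariance(C, bandwidth):
    # Step (i) in its simplest form: keep only entries within
    # `bandwidth` of the diagonal, neglecting the small
    # far-off-diagonal elements of the prior covariance
    # (or precision) matrix.
    n = C.shape[0]
    i, j = np.indices((n, n))
    return np.where(np.abs(i - j) <= bandwidth, C, 0.0)

# Toy prior: exponentially decaying correlations on a 1D grid.
n = 50
t = np.arange(n)
C = np.exp(-np.abs(t[:, None] - t[None, :]) / 5.0)
C_loc = localize_covariance(C, bandwidth=15)
```

Hard truncation like this can destroy positive definiteness, which is why covariance localization in numerical weather prediction typically multiplies the covariance entrywise by a smooth, compactly supported taper (e.g. Gaspari-Cohn) instead.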

Cited by 34 publications (38 citation statements) · References 63 publications (110 reference statements)
“…In this paper, we discuss how to infer initial conditions of high-dimensional SDEs with local interactions, when noisy observations are available. Previous research [18] has shown that Metropolis-within-Gibbs (MwG) sampling has a dimension-independent convergence rate. However, each MwG iteration requires a computation cost of O(n²), with n being the model dimension.…”
Section: Discussion
confidence: 99%
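To make the O(n²) cost concrete, here is a minimal sketch of one Metropolis-within-Gibbs sweep on a toy Gaussian target (an assumed illustration, not the implementation from [18]): each coordinate update calls a generic log-posterior at O(n) cost, so a sweep over all n coordinates costs O(n²).

```python
import numpy as np

def mwg_sweep(x, log_post, prop_std, rng):
    # One Metropolis-within-Gibbs sweep: propose and accept/reject
    # each coordinate in turn. A generic log_post costs O(n) per
    # call, so the full sweep over n coordinates costs O(n^2).
    lp = log_post(x)
    for i in range(x.size):
        x_prop = x.copy()
        x_prop[i] += prop_std * rng.standard_normal()
        lp_prop = log_post(x_prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept
            x, lp = x_prop, lp_prop
    return x

# Toy target: standard Gaussian in n dimensions (illustrative only).
rng = np.random.default_rng(0)
x = rng.standard_normal(20)
for _ in range(200):
    x = mwg_sweep(x, lambda z: -0.5 * z @ z, prop_std=1.0, rng=rng)
```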
“…In [18], it is shown that MwG has dimension-independent performance if 1) p₀ is a Gaussian distribution and its covariance or precision matrix is close to being banded; 2) each component of y depends significantly on only a few components of x. It is also discussed how to truncate the far-off-diagonal entries of the prior covariance or precision matrix.…”
confidence: 99%
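A hypothetical sketch of how the banded structure in condition 1) can be exploited: if the precision matrix P is banded with bandwidth b, the change in the Gaussian log-density log π(x) = -½ xᵀPx when a single coordinate moves touches only O(b) entries of P, so the per-coordinate cost drops from O(n) to O(b). All names below are illustrative.

```python
import numpy as np

def banded_logdensity_change(x, i, d, P, b):
    # Change in log pi(x) = -0.5 * x^T P x when coordinate i moves
    # by d, for a precision matrix P banded with bandwidth b:
    # delta = -(d * (P x)_i + 0.5 * P_ii * d^2),
    # where (P x)_i needs only the O(b) nonzero entries in row i.
    lo, hi = max(0, i - b), min(x.size, i + b + 1)
    return -(d * (P[i, lo:hi] @ x[lo:hi]) + 0.5 * P[i, i] * d * d)
```

Used inside the acceptance step of the MwG sweep sketched above, this ratio brings the cost of a full sweep from O(n²) down to O(nb).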
“…Local particle filters are reviewed in [21] and their potential to beat the curse of dimension is investigated from a theoretical viewpoint in [16]. Localization is popular in ensemble Kalman filters [20] and has been employed in Markov chain Monte Carlo [22, 23]. Our focus in this paper is not on localization but rather on providing a unified and accessible understanding of the roles that dimension, noise level and other model parameters play in approximating the Bayesian update.…”
Section: Introduction
confidence: 99%