2020
DOI: 10.1007/s00211-020-01131-1
On the convergence of the Laplace approximation and noise-level-robustness of Laplace-based Monte Carlo methods for Bayesian inverse problems

Abstract: The Bayesian approach to inverse problems provides a rigorous framework for the incorporation and quantification of uncertainties in measurements, parameters and models. We are interested in designing numerical methods which are robust w.r.t. the size of the observational noise, i.e., methods which behave well in case of concentrated posterior measures. The concentration of the posterior is a highly desirable situation in practice, since it relates to informative or large data. However, it can pose a computati…
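To make the notion of a concentrated posterior concrete, the following display sketches the standard small-noise formulation of a Bayesian inverse problem. The notation (forward map G, noise level η, noise covariance Γ, prior μ₀) is a common convention assumed here for illustration, not quoted from the paper:

```latex
% Hedged sketch of the small-noise Bayesian inverse problem setup.
% The symbols G (forward map), \eta (noise level), \Gamma (noise
% covariance) and \mu_0 (prior) are assumptions for illustration.
\[
  y = G(u) + \eta\,\varepsilon, \qquad \varepsilon \sim \mathcal{N}(0,\Gamma),
\]
\[
  \frac{\mathrm{d}\mu_\eta^{y}}{\mathrm{d}\mu_0}(u)
  \propto \exp\!\Big(-\frac{1}{2\eta^{2}}
  \big\|\Gamma^{-1/2}\big(y-G(u)\big)\big\|^{2}\Big).
\]
% As \eta \to 0 (or, analogously, as the data size grows), the posterior
% \mu_\eta^{y} concentrates around the minimizers of the data misfit;
% this is the regime in which the paper asks for noise-level robustness.
```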

Cited by 64 publications (55 citation statements). References 42 publications.
“…In the case of the current version of the BayesFactor package, such an option is an alternative algorithm using the Laplace approximation. Because this optional algorithm does not use any sampling, it will provide stable results across runs (Schillings, Sprungk, & Wacker, 2020; for alternative algorithms with stable, analytic results, see Chen, Villa, & Ghattas, 2017; Schillings & Schwab, 2013). At the same time, the Laplace approximation can return systematically different results than (many iterations of) MCMC-based methods, especially when sample sizes are small.…”
Section: Discussion
confidence: 99%
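As a hedged illustration of why a Laplace-based computation is run-to-run stable, the following minimal Python sketch approximates a log marginal likelihood deterministically from the posterior mode and Hessian. The toy model (Gaussian likelihood with unknown mean, standard normal prior) is a hypothetical stand-in and does not reproduce the BayesFactor package's internals:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical toy model: Gaussian likelihood with unknown mean theta,
# standard normal prior. Chosen only to illustrate the Laplace formula.
rng = np.random.default_rng(0)
data = rng.normal(loc=0.3, scale=1.0, size=20)

def neg_log_joint(theta):
    # -log[ p(data | theta) * p(theta) ], up to additive constants
    log_lik = -0.5 * np.sum((data - theta) ** 2)
    log_prior = -0.5 * theta ** 2
    return -(log_lik + log_prior)

# Laplace approximation:
#   log p(data) ~= log p(data, theta*) + (d/2) log(2*pi)
#                  - (1/2) log det H,
# with H the Hessian of the negative log joint at the mode theta*.
res = minimize(lambda t: neg_log_joint(t[0]), x0=[0.0])
theta_map = res.x[0]

# Second derivative is analytic for this toy model: n data terms + 1 prior.
hessian = len(data) + 1.0

log_evidence = (-neg_log_joint(theta_map)
                + 0.5 * np.log(2 * np.pi)
                - 0.5 * np.log(hessian))
print(log_evidence)  # identical on every run: no sampling involved
```

Because every step above is deterministic optimization and linear algebra, repeated runs give identical output, in contrast to an MCMC estimate, which fluctuates with the random seed.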
“…The Laplace approximation is exact when the parameter-to-observable map f(m) is linear, and often provides a reasonable approximation to the posterior, since the error is of the order of the departure from linearity of the parameter-to-observable map (Helin and Kretschmann 2020). Moreover, in the limit of small noise or large data, the Laplace approximation converges to the posterior (Schillings, Sprungk and Wacker 2020).…”
Section: Bayesian Formulation
confidence: 99%
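The construction this statement refers to can be sketched in a few lines: find the MAP point, take the Hessian of the negative log posterior there, and use its inverse as the Gaussian covariance. The forward map f and all problem sizes below are illustrative assumptions, not taken from the cited papers:

```python
import numpy as np
from scipy.optimize import minimize

# Hedged sketch of the Laplace approximation N(m_MAP, H^{-1}) for a
# nonlinear Bayesian inverse problem with additive Gaussian noise.
def f(m):
    # mildly nonlinear parameter-to-observable map (assumed for illustration)
    return m + 0.1 * m ** 3

y_obs = np.array([0.8, -0.2])          # synthetic observations
noise_var, prior_var = 0.05, 1.0

def neg_log_post(m):
    misfit = np.sum((y_obs - f(m)) ** 2) / (2 * noise_var)
    prior = np.sum(m ** 2) / (2 * prior_var)
    return misfit + prior

# MAP point: mode of the posterior.
m_map = minimize(neg_log_post, x0=np.zeros(2)).x

# Hessian of the negative log posterior at the MAP, via central differences.
def hessian_fd(g, x, eps=1e-4):
    d = len(x)
    H = np.zeros((d, d))
    for i in range(d):
        for j in range(d):
            e_i, e_j = np.eye(d)[i] * eps, np.eye(d)[j] * eps
            H[i, j] = (g(x + e_i + e_j) - g(x + e_i - e_j)
                       - g(x - e_i + e_j) + g(x - e_i - e_j)) / (4 * eps ** 2)
    return H

H = hessian_fd(neg_log_post, m_map)
cov = np.linalg.inv(H)                 # Laplace covariance
# The Laplace approximation is the Gaussian N(m_map, cov); it is exact
# when f is linear and improves as the noise level shrinks.
```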
“…Additional tricks are necessary to make such methods work for very concentrated posterior distributions. In Schillings et al (2019), estimators of this type are derived with performance that is independent of the noise level in the observations. It would be interesting to investigate their applicability in an MLMC context.…”
Section: Approaches For Multilevel Monte Carlo Estimation
confidence: 99%
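A minimal sketch of the estimator type in question is importance sampling with the Laplace approximation as the proposal: because the proposal concentrates at the same rate as the posterior, the importance weights stay well behaved as the noise shrinks. The one-dimensional toy target below is an assumption for illustration only:

```python
import numpy as np

# Hedged sketch of Laplace-based importance sampling: draw from the
# Gaussian Laplace approximation and reweight to the true posterior.
rng = np.random.default_rng(1)
eta = 0.01                                   # small noise level

def log_post_unnorm(u):
    # concentrated, slightly non-Gaussian toy target (assumed)
    return -((u - 1.0) ** 2 + 0.1 * (u - 1.0) ** 4) / (2 * eta ** 2)

# Laplace approximation at the mode u* = 1: curvature 1/eta^2,
# hence standard deviation eta.
u_map, lap_std = 1.0, eta

# Self-normalized importance sampling of a quantity of interest q(u) = u.
u = rng.normal(u_map, lap_std, size=10_000)
log_w = log_post_unnorm(u) - (-(u - u_map) ** 2 / (2 * lap_std ** 2))
w = np.exp(log_w - log_w.max())              # stabilize before normalizing
estimate = np.sum(w * u) / np.sum(w)
print(estimate)
# Because the proposal tracks the posterior's concentration as eta -> 0,
# the weights stay close to uniform and the estimator's accuracy does
# not degrade with the noise level.
```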