2021
DOI: 10.48550/arxiv.2112.00713
Preprint

hIPPYlib-MUQ: A Bayesian Inference Software Framework for Integration of Data with Complex Predictive Models under Uncertainty

Abstract: Bayesian inference provides a systematic framework for integration of data with mathematical models to quantify the uncertainty in the solution of the inverse problem. However, the solution of Bayesian inverse problems governed by complex forward models described by partial differential equations (PDEs) remains prohibitive with black-box Markov chain Monte Carlo (MCMC) methods. We present hIPPYlib-MUQ, an extensible and scalable software framework that contains implementations of state-of-the-art algorithms aimed …
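For orientation (this formula is not part of the abstract, but is the standard setting it refers to), the Bayesian inverse problem combines a prior on the parameter m with a data likelihood. Assuming an additive Gaussian noise model with covariance Gamma_noise and parameter-to-observable map F, the posterior that MCMC methods such as those in hIPPYlib-MUQ sample from reads:

```latex
% Bayes' rule for an inverse problem with additive Gaussian noise
\pi_{\mathrm{post}}(m \mid d)
  \;\propto\; \pi_{\mathrm{like}}(d \mid m)\,\pi_{\mathrm{prior}}(m),
\qquad
\pi_{\mathrm{like}}(d \mid m)
  \;\propto\; \exp\!\Bigl(-\tfrac12 \bigl\| F(m) - d \bigr\|_{\Gamma_{\mathrm{noise}}^{-1}}^{2}\Bigr).
```

Here F typically involves a PDE solve, and m is high-dimensional after discretization, which is what makes black-box MCMC prohibitive.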

Cited by 3 publications (3 citation statements)
References 58 publications
“…Another MCMC sampling approach is the generalized preconditioned Crank-Nicolson (gpCN) method [27,28]. An attractive choice for the preconditioner is the Hessian at the MAP point [29]. For these and other MCMC samplers, one typically needs to apply the inverse Hessian H_k^{-1} or its square root H_k^{-1/2} repeatedly and efficiently, which also motivates the study presented in this paper.…”
Section: Bayesian Inverse Problems
confidence: 99%
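As a concrete toy illustration of the repeated H_k^{-1} and H_k^{-1/2} applications mentioned in the statement above, here is a minimal NumPy sketch. It assumes a small dense symmetric positive definite Hessian so that a one-time eigendecomposition is affordable; large-scale codes such as hIPPYlib-MUQ would instead use matrix-free low-rank (e.g. Lanczos or randomized) approximations. The matrix H below is a synthetic stand-in, not taken from the cited papers.

```python
# Minimal sketch: applying H^{-1} and H^{-1/2} to vectors via a one-time
# eigendecomposition of a small dense SPD Hessian. Assumption: H is a
# toy-sized stand-in for the Hessian at the MAP point; large-scale solvers
# would replace this with matrix-free low-rank approximations.
import numpy as np

rng = np.random.default_rng(0)

# Toy SPD "Hessian at the MAP point": H = B B^T + I.
n = 50
B = rng.standard_normal((n, n))
H = B @ B.T + np.eye(n)

# Factor once: H = Q diag(lam) Q^T with lam > 0.
lam, Q = np.linalg.eigh(H)

def apply_H_inv(v):
    """Return H^{-1} v using the stored eigendecomposition."""
    return Q @ ((Q.T @ v) / lam)

def apply_H_inv_sqrt(v):
    """Return H^{-1/2} v, e.g. to precondition proposals or whiten noise."""
    return Q @ ((Q.T @ v) / np.sqrt(lam))

# Example use inside a sampler: draw xi ~ N(0, H^{-1}) as H^{-1/2} N(0, I).
xi = apply_H_inv_sqrt(rng.standard_normal(n))
print(np.allclose(H @ apply_H_inv(xi), xi))  # sanity check: H (H^{-1} xi) == xi
```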
“…This choice has two advantages: first, it is integer-valued, and second, it is easy to obtain a factored form of the covariance operator. For these reasons, this approach is attractive from a computational standpoint, and has since become popular and has resulted in scalable software implementations [31,21]. A similar approach can also be found in [25], in which the authors also used the Whittle-Matérn priors for Bayesian inverse problems.…”
Section: Motivation and Introduction
confidence: 99%
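To illustrate the "factored form of the covariance operator" referred to above, here is a minimal 1D sketch (a simplification of my own, not taken from the cited papers): the prior covariance is taken as C = A^{-2} with A = delta*I - gamma*Laplacian, so C^{1/2} = A^{-1} and a prior sample is obtained by one linear solve applied to white noise. Mass-matrix and boundary-condition details of the PDE setting are omitted.

```python
# Minimal sketch: sampling from a Whittle-Matern-type Gaussian prior whose
# covariance is given in factored form, C = A^{-2} with A = delta*I - gamma*L
# (L = 1D finite-difference Laplacian). Then C^{1/2} = A^{-1}, so a sample is
# u = A^{-1} xi with xi ~ N(0, I). Assumptions: unit interval, homogeneous
# Dirichlet boundary conditions, mass-matrix scaling omitted for brevity.
import numpy as np

rng = np.random.default_rng(0)

n = 200                      # interior grid points
h = 1.0 / (n + 1)            # mesh spacing on (0, 1)
gamma, delta = 0.1, 1.0      # control correlation length and pointwise variance

# 1D finite-difference Laplacian with Dirichlet BCs (negative definite).
L = (np.diag(-2.0 * np.ones(n)) +
     np.diag(np.ones(n - 1), 1) +
     np.diag(np.ones(n - 1), -1)) / h**2

A = delta * np.eye(n) - gamma * L   # SPD; C^{-1} = A^2, so C^{1/2} = A^{-1}

def sample_prior():
    """Draw u ~ N(0, C) with C = A^{-2}: one solve with A per sample."""
    xi = rng.standard_normal(n)
    return np.linalg.solve(A, xi)

u = sample_prior()
print(u.shape)  # (200,)
```

Choosing an integer exponent (here 2) is exactly what makes this factorization trivial: the square root of the covariance is a single solve with the elliptic operator A.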
“…Other samplers with dimension-independent convergence properties, utilizing derivative information, include ∞-MALA, ∞-HMC and their manifold variants [2] and DILI [7]. The papers [2,20] provide a systematic comparison of a number of the above algorithms applied to high-dimensional Bayesian inverse problems. Outside of MCMC, [1] considers the performance of importance sampling on general state spaces, and its dependence on the discretization dimension and effective dimension of the problem.…”
confidence: 99%
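For context, the samplers listed above (∞-MALA, ∞-HMC, DILI) build on the preconditioned Crank-Nicolson (pCN) construction, whose basic non-derivative step is sketched below in a finite-dimensional toy form. The acceptance ratio involves only the negative log-likelihood Phi, which is what gives the method its robustness to mesh refinement. The names and the toy Phi are illustrative, not taken from the cited papers.

```python
# Minimal sketch of one pCN step, the non-derivative baseline that
# dimension-robust samplers such as infinity-MALA / infinity-HMC / DILI
# refine. Assumptions: prior N(0, C) with a dense toy covariance C, and an
# arbitrary negative log-likelihood Phi supplied by the user.
import numpy as np

rng = np.random.default_rng(0)

def pcn_step(u, Phi, C_chol, beta=0.25):
    """One pCN Metropolis step targeting pi(u) ~ exp(-Phi(u)) N(u; 0, C).

    Proposal: v = sqrt(1 - beta^2) * u + beta * xi,  xi ~ N(0, C).
    The Gaussian prior terms cancel, so the accept ratio uses only Phi.
    """
    xi = C_chol @ rng.standard_normal(u.size)
    v = np.sqrt(1.0 - beta**2) * u + beta * xi
    if np.log(rng.uniform()) < Phi(u) - Phi(v):
        return v, True
    return u, False

# Toy usage: prior covariance C, Gaussian-ish misfit around data d.
n = 100
C = 0.5 * np.eye(n)
C_chol = np.linalg.cholesky(C)
d = np.ones(n)
Phi = lambda u: 0.5 * np.sum((u - d) ** 2) / 0.1   # toy data misfit

u, n_acc = np.zeros(n), 0
for _ in range(1000):
    u, acc = pcn_step(u, Phi, C_chol, beta=0.2)
    n_acc += acc
print("acceptance rate:", n_acc / 1000)
```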