2016
DOI: 10.1137/140995817

Accurate Solution of Bayesian Inverse Uncertainty Quantification Problems Combining Reduced Basis Methods and Reduction Error Models

Abstract: Reuse: Unless indicated otherwise, full-text items are protected by copyright with all rights reserved. The copyright exception in section 29 of the Copyright, Designs and Patents Act 1988 allows the making of a single copy solely for the purpose of non-commercial research or private study within the limits of fair dealing. The publisher or other rights-holder may allow further reproduction and re-use of this version; refer to the White Rose Research Online record for this item. Where records identify the publish…

Cited by 45 publications (46 citation statements)
References 46 publications
“…This work demonstrated significant improvements in accuracy with respect to the a priori multifidelity correction error model in high-dimensional parameter spaces. Follow-on work also demonstrated the promise of this kind of a posteriori error model in an uncertainty-quantification context [40].…”
Section: Introduction
confidence: 92%
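The a posteriori error model referenced in this statement can be illustrated with a minimal sketch: learn the discrepancy between a high-fidelity and a reduced forward map from a few training samples, then add the learned correction to the cheap model's predictions. The 1-D maps and names below are hypothetical stand-ins for illustration, not the models from the cited work.

```python
import numpy as np

# Hypothetical 1-D forward maps (illustrative assumptions):
# a "high-fidelity" model and a cheap "reduced" model missing one term.
def f_high(mu):
    return np.sin(mu) + 0.1 * mu**2

def f_reduced(mu):
    return np.sin(mu)  # omits the quadratic term

# A posteriori reduction error model: regress the observed discrepancy
# e(mu) = f_high(mu) - f_reduced(mu) on a few training parameters.
mu_train = np.linspace(-2.0, 2.0, 9)
err_train = f_high(mu_train) - f_reduced(mu_train)
coeffs = np.polyfit(mu_train, err_train, deg=2)  # quadratic error surrogate

def f_corrected(mu):
    """Reduced model plus learned error correction."""
    return f_reduced(mu) + np.polyval(coeffs, mu)

mu_test = 1.3
raw_err = abs(f_high(mu_test) - f_reduced(mu_test))
cor_err = abs(f_high(mu_test) - f_corrected(mu_test))
print(raw_err, cor_err)  # correction should shrink the error
```

At test parameters the corrected model tracks the high-fidelity output far more closely than the raw reduced model, which is the effect the error model exploits inside a likelihood evaluation.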
“…The backbone of the approximation developed here is a given reduced model, denoted F*(·), built using existing techniques, including grid coarsening, 13,16,35,36 linearization of the forward model, 11 and projection-based methods. 37-41 Our key contribution here is to present a new way to improve the approximation to the likelihood function by considering the posterior statistics of the numerical error of the reduced model. This leads to the ADA algorithm with substantial improvement in the computational efficiency compared to the classical DA or surrogate transition algorithms.…”
Section: Approximations to the Forward Map and Posterior Distribution
confidence: 99%
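The delayed-acceptance (DA) idea this statement builds on can be sketched as a two-stage Metropolis step: proposals are first screened with a cheap approximate posterior, and the expensive posterior is evaluated only for proposals that survive the screen, with a second-stage ratio that keeps the chain targeting the true posterior. The densities below are illustrative assumptions, not the cited work's models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative densities: expensive "true" log-posterior and a
# slightly wrong cheap surrogate (both assumptions for this sketch).
def log_post(x):
    return -0.5 * x**2             # standard normal target

def log_post_cheap(x):
    return -0.5 * (x / 1.1)**2     # mildly mis-scaled surrogate

def delayed_acceptance(n_steps, step=1.0, x0=0.0):
    """Two-stage Metropolis: screen with the cheap density, then
    correct with the expensive one so the chain stays exact."""
    x, chain, expensive_calls = x0, [], 0
    for _ in range(n_steps):
        y = x + step * rng.standard_normal()
        # Stage 1: cheap screening accept/reject.
        a1 = min(1.0, np.exp(log_post_cheap(y) - log_post_cheap(x)))
        if rng.random() < a1:
            # Stage 2: expensive correction ratio
            # pi(y) pi*(x) / (pi(x) pi*(y)) for a symmetric proposal.
            expensive_calls += 1
            a2 = min(1.0, np.exp(log_post(y) - log_post_cheap(y)
                                 - log_post(x) + log_post_cheap(x)))
            if rng.random() < a2:
                x = y
        chain.append(x)
    return np.array(chain), expensive_calls

chain, n_exp = delayed_acceptance(5000)
print(chain.mean(), chain.std(), n_exp)
```

The chain's moments match the true target while the expensive density is evaluated only on the fraction of proposals that pass the cheap screen, which is where the computational saving comes from.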
“…Under certain assumptions, it has been shown that the posterior distribution induced by the low-fidelity model converges to the original/high-fidelity posterior distribution as the low-fidelity approximation is refined [39,18]. Other approaches adapt low-fidelity models [19,32] over a finite interval of posterior exploration, or quantify the error introduced by sampling the low-fidelity posterior distribution [21,35]. Yet another family of approaches incrementally and infinitely refines approximations of the forward model on the fly during MCMC sampling [16,15]; under appropriate conditions, these schemes guarantee that the MCMC chain asymptotically samples the high-fidelity posterior distribution.…”
Section: Introduction
confidence: 99%
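The convergence behavior described in this statement — the low-fidelity posterior approaching the high-fidelity one as the surrogate is refined — can be checked numerically in a toy 1-D Bayesian inverse problem. The setup below (Gaussian likelihood, sin forward map, Taylor truncations as "low-fidelity" surrogates) is an illustrative assumption, not the cited papers' construction.

```python
import math
import numpy as np

grid = np.linspace(-1.5, 1.5, 2001)
dx = grid[1] - grid[0]
data, sigma = 0.7, 0.1  # hypothetical observation and noise level

def posterior(forward):
    """Gaussian likelihood x flat prior, normalized on the grid."""
    p = np.exp(-0.5 * ((data - forward(grid)) / sigma) ** 2)
    return p / (p.sum() * dx)

def F(mu):          # "high-fidelity" forward map
    return np.sin(mu)

def F_k(mu, k):     # Taylor truncation of sin: the "low-fidelity" map
    return sum((-1)**n * mu**(2*n + 1) / math.factorial(2*n + 1)
               for n in range(k + 1))

p_true = posterior(F)
# Total-variation distance between surrogate and true posteriors.
tv = [0.5 * dx * np.abs(posterior(lambda m: F_k(m, k)) - p_true).sum()
      for k in range(4)]
print(tv)  # distance shrinks as the surrogate is refined
```

As the truncation order grows the forward-map error shrinks, and the induced posterior's total-variation distance to the high-fidelity posterior drops accordingly, mirroring the qualitative convergence result quoted above.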