2019
DOI: 10.1007/s10444-019-09711-y

A transport-based multifidelity preconditioner for Markov chain Monte Carlo

Abstract: Markov chain Monte Carlo (MCMC) sampling of posterior distributions arising in Bayesian inverse problems is challenging when evaluations of the forward model are computationally expensive. Replacing the forward model with a low-cost, low-fidelity model often significantly reduces computational cost; however, employing a low-fidelity model alone means that the stationary distribution of the MCMC chain is the posterior distribution corresponding to the low-fidelity model, rather than the original posterior distribution…
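The correction the abstract alludes to can be illustrated with a generic two-stage (delayed-acceptance) Metropolis-Hastings sampler: the low-fidelity model screens proposals cheaply, and a second, high-fidelity stage restores the exact posterior as the stationary distribution. The following is a minimal sketch of that generic idea, not of the paper's transport-based preconditioner; `log_post_lo` and `log_post_hi` are hypothetical stand-ins for the low- and high-fidelity log-posteriors.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_post_hi(theta):
    # Hypothetical expensive high-fidelity log-posterior (stand-in only).
    return -0.5 * np.sum((theta - 1.0) ** 2)

def log_post_lo(theta):
    # Hypothetical cheap low-fidelity approximation (deliberately biased).
    return -0.5 * np.sum((theta - 0.9) ** 2) / 1.2

def delayed_acceptance_mh(theta0, n_steps, step=0.5):
    """Two-stage Metropolis-Hastings with a symmetric random-walk proposal:
    the low-fidelity model screens proposals cheaply (stage 1), and the
    high-fidelity correction (stage 2) keeps the chain's stationary
    distribution equal to the high-fidelity posterior."""
    theta = np.asarray(theta0, dtype=float)
    chain = [theta.copy()]
    for _ in range(n_steps):
        prop = theta + step * rng.standard_normal(theta.shape)
        dlo = log_post_lo(prop) - log_post_lo(theta)
        # Stage 1: cheap screening; proposals rejected here never touch
        # the expensive model.
        if rng.random() < min(1.0, np.exp(dlo)):
            dhi = log_post_hi(prop) - log_post_hi(theta)
            # Stage 2: correction factor for the modified proposal law.
            if rng.random() < min(1.0, np.exp(dhi - dlo)):
                theta = prop
        chain.append(theta.copy())
    return np.array(chain)

chain = delayed_acceptance_mh(np.zeros(2), n_steps=5000)
print(chain.mean(axis=0))  # close to the high-fidelity posterior mean (1, 1)
```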

Cited by 22 publications (31 citation statements)
Citation types: 0 supporting, 31 mentioning, 0 contrasting
Years published: 2020–2024
References 68 publications
“…Apart from building transport maps using TT decompositions, most other methods approximate the transport map T by solving an optimisation problem such that T minimises some statistical divergence between the target ν_π and the pushforward T♯μ. The map T often has a triangular structure, which makes evaluating the Jacobian and the inverse of T computationally efficient, and it can be represented using polynomials [4,43,50,51], kernel functions [15,37], invertible neural networks [7,9,10,35,49,52], etc. In this setting, the objective function has to be approximated by a Monte Carlo average and minimised with a (stochastic) gradient-based method.…”
Section: Related Work (mentioning)
confidence: 99%
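As an illustration of this optimisation-based construction, here is a minimal sketch: a lower-triangular affine map is fitted by minimising a Monte Carlo estimate of the KL divergence between the pushforward of a standard Gaussian reference and a correlated Gaussian target. The target, the map parameterisation, and all names are illustrative assumptions, not taken from the cited works.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Hypothetical 2-D target nu_pi: a correlated Gaussian whose log-density is
# known up to an additive constant (as in Bayesian inverse problems).
C = np.array([[1.0, 0.8], [0.8, 1.0]])
Cinv = np.linalg.inv(C)

def log_target(z):
    return -0.5 * np.einsum("ni,ij,nj->n", z, Cinv, z)

# Samples from the reference measure mu = N(0, I).
X = rng.standard_normal((2000, 2))

def push(params, x):
    """Lower-triangular map T: z1 = a1 + exp(s1)*x1,
    z2 = a2 + c*x1 + exp(s2)*x2. The exp(.) keeps the diagonal positive
    (monotonicity), and log|det dT| = s1 + s2 is free to evaluate."""
    a1, s1, a2, c, s2 = params
    z1 = a1 + np.exp(s1) * x[:, 0]
    z2 = a2 + c * x[:, 0] + np.exp(s2) * x[:, 1]
    return np.stack([z1, z2], axis=1), s1 + s2

def kl_objective(params):
    # Monte Carlo estimate of KL(T#mu || nu_pi), dropping T-independent
    # terms:  E_mu[ -log|det dT(X)| - log pi(T(X)) ].
    z, logdet = push(params, X)
    return np.mean(-logdet - log_target(z))

res = minimize(kl_objective, x0=np.zeros(5), method="BFGS")
# The optimum recovers the Cholesky factor of C:
# exp(s1) ~ 1.0, c ~ 0.8, exp(s2) ~ 0.6.
print(res.x)
```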
“…- Density approximation. The work of [4,43,51] aims to minimise the Kullback-Leibler (KL) divergence of the pushforward T♯μ from the target ν_π, so that the pushforward density naturally approximates the target density. In this case, the KL divergence is approximated using the Jacobian of T and the target density evaluated at samples drawn from the analytically tractable reference measure.…”
Section: Related Work (mentioning)
confidence: 99%
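In symbols (a standard form of this objective, not quoted from the paper), with reference density ρ and target density π:

$$
D_{\mathrm{KL}}\!\left(T_\sharp\mu \,\middle\|\, \nu_\pi\right)
= \mathbb{E}_{X\sim\mu}\!\left[\log\rho(X) - \log\bigl|\det\nabla T(X)\bigr| - \log\pi(T(X))\right]
\approx \frac{1}{N}\sum_{i=1}^{N}\left[\log\rho(X_i) - \log\bigl|\det\nabla T(X_i)\bigr| - \log\pi(T(X_i))\right],
\qquad X_i \overset{\mathrm{iid}}{\sim} \mu .
$$

Because π enters only through its logarithm, an unnormalised target shifts the objective by a constant and does not change the minimiser; for a triangular map, the log-determinant reduces to a sum of logs of the diagonal partial derivatives, which is why that structure is computationally convenient.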
“…Alternatively, it is possible to construct a multilevel approximation without relying on such a telescoping sum. Examples are the multifidelity preconditioned MCMC method in [33], Multilevel Sequential² Monte Carlo [25], and our multilevel sparse Leja approximation presented in section 3.…”
Section: 2 (mentioning)
confidence: 99%
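For contrast with the telescoping-sum construction these methods avoid, a minimal multilevel Monte Carlo sketch follows; the model hierarchy `f_level` is a hypothetical stand-in with an O(h) discretisation bias.

```python
import numpy as np

rng = np.random.default_rng(2)

def f_level(theta, level):
    # Hypothetical model hierarchy: higher level = finer, more expensive
    # approximation of a quantity of interest, with an O(h) bias.
    h = 2.0 ** -level
    return np.sin(theta) + h * np.cos(3.0 * theta)

def mlmc_estimate(n_per_level):
    """Telescoping-sum estimator E[f_L] = E[f_0] + sum_l E[f_l - f_{l-1}],
    with the two models on each correction level evaluated at the same
    (coupled) samples so the level differences have small variance."""
    est = 0.0
    for level, n in enumerate(n_per_level):
        theta = rng.standard_normal(n)
        if level == 0:
            est += np.mean(f_level(theta, 0))
        else:
            est += np.mean(f_level(theta, level) - f_level(theta, level - 1))
    return est

# Many cheap coarse samples, few expensive fine samples.
print(mlmc_estimate([100_000, 10_000, 1_000]))
```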
“…The smoothness assumptions can be weakened by using piecewise polynomial approximations together with Voronoi tessellations of the parameter space [31]. Of course, surrogates can also be used to accelerate sampling-based approximations such as MCMC; see, e.g., [26,33]. We remark that Quasi-Monte Carlo [8,35] is in principle a sampling-free method that does not rely on surrogates; however, it again requires a smooth approximand and is often used together with randomization.…”
(mentioning)
confidence: 99%
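A minimal sketch of the randomized Quasi-Monte Carlo idea mentioned here, using SciPy's `scipy.stats.qmc` module with a scrambled Sobol sequence; the integrand is a hypothetical smooth function chosen for illustration.

```python
import numpy as np
from scipy.stats import norm, qmc

rng = np.random.default_rng(3)
d, n = 2, 4096  # n is a power of two, as Sobol sampling prefers

def integrand(u):
    # Hypothetical smooth integrand: map U(0,1)^d points to N(0,1)^d and
    # evaluate exp(-|z|^2 / 8); the exact integral is 1/1.25 = 0.8.
    z = norm.ppf(u)
    return np.exp(-0.125 * np.sum(z ** 2, axis=1))

mc = integrand(rng.random((n, d))).mean()  # plain Monte Carlo

# Randomized QMC: scrambled Sobol points keep low discrepancy while the
# randomization restores unbiasedness and enables error estimation.
sobol = qmc.Sobol(d=d, scramble=True, seed=3)
rqmc = integrand(sobol.random(n)).mean()

print(mc, rqmc)  # both near 0.8; the RQMC error is typically much smaller
```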