2013
DOI: 10.1016/j.spa.2012.12.001
Advanced MCMC methods for sampling on diffusion pathspace

Abstract: The need to calibrate increasingly complex statistical models requires a persistent effort for further advances on available, computationally intensive Monte-Carlo methods. We study here an advanced version of familiar Markov-chain Monte-Carlo (MCMC) algorithms that sample from target distributions defined as change of measures from Gaussian laws on general Hilbert spaces. Such a model structure arises in several contexts: we focus here on the important class of statistical models driven by diffusion paths whe…

Cited by 19 publications (24 citation statements)
References 34 publications
“…In addition, the field of pseudo-Riemannian geometry deals with forms of G(x), which need not be positive-definite [39]; so again, understanding could be gained from here. Some recent work in high-dimensional inference has centred on defining MCMC methods for which efficiency scales O(1) with respect to the dimension, n, of π(·) [19,59]. In the case where X takes values in some infinite-dimensional function space, this can be done provided a Gaussian prior measure is defined for X.…”
Section: Discussion
confidence: 99%
“…The key challenge for MCMC is to define transition kernels for which proposed moves are inside the support of π(·). A straightforward approach is to define proposals for which the prior is invariant, since the likelihood contribution to the posterior typically will not alter its support from that of the prior [19]. However, the posterior may still look very different from the prior, as noted in [61], so this proposal mechanism, though O(1), can still result in slow exploration.…”
Section: Discussion
confidence: 99%
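The prior-invariant proposal discussed in the statement above is, for a Gaussian prior N(0, C), the preconditioned Crank-Nicolson (pCN) move: it leaves the prior invariant exactly, so the accept ratio involves only the likelihood and is dimension-robust. A minimal finite-dimensional sketch, with function names, the likelihood, and the step size β all illustrative assumptions rather than any author's implementation:

```python
import numpy as np

def pcn_step(x, log_like, C_sqrt, beta=0.2, rng=np.random.default_rng()):
    """One pCN Metropolis-Hastings step targeting
    pi(x) ∝ exp(log_like(x)) * N(0, C) prior density.
    The proposal leaves N(0, C) invariant, so the acceptance
    ratio involves the likelihood only (illustrative sketch)."""
    xi = C_sqrt @ rng.standard_normal(x.size)       # draw from the prior
    prop = np.sqrt(1.0 - beta**2) * x + beta * xi   # pCN proposal
    log_a = log_like(prop) - log_like(x)            # prior terms cancel
    if np.log(rng.uniform()) < log_a:
        return prop, True
    return x, False
```

Because the prior terms cancel exactly, the acceptance probability does not degenerate as the discretization of the function space is refined, which is the O(1) scaling referred to above.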
“…Moreover, the optimal value (in terms of maximising acceptance rate) may vary between observation intervals. Beskos et al (2013) use Hybrid Monte Carlo (HMC) on pathspace to generate SDE sample paths under various observation regimes. For the applications considered, the authors found reasonable gains in overall efficiency (as measured by minimum effective sample size per CPU time) over an independence sampler with a Brownian bridge proposal.…”
Section: Latent Values
confidence: 99%
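The independence sampler with a Brownian bridge proposal, used as the baseline above, can be sketched as follows for a unit-diffusion SDE dX = b(X)dt + dW observed at both endpoints. The helper names and the Euler-type Girsanov sum are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def brownian_bridge(x0, xT, T, n, rng):
    """Discretized Brownian bridge from (0, x0) to (T, xT) on n steps."""
    t = np.linspace(0.0, T, n + 1)
    dW = rng.standard_normal(n) * np.sqrt(np.diff(t))
    W = np.concatenate([[0.0], np.cumsum(dW)])
    # Condition the Brownian path to end at xT at time T
    return x0 + W - (t / T) * (W[-1] - (xT - x0))

def girsanov_loglik(path, t, drift):
    """Log Radon-Nikodym derivative of the (unit-diffusion) SDE law
    w.r.t. the Brownian bridge, via an Euler sum of the Girsanov
    formula (illustrative discretization)."""
    dt = np.diff(t)
    dx = np.diff(path)
    b = drift(path[:-1])
    return float(np.sum(b * dx - 0.5 * b**2 * dt))

def indep_step(cur, x0, xT, T, n, drift, rng):
    """Independence Metropolis-Hastings step: propose a fresh
    Brownian bridge, accept with the Girsanov likelihood ratio."""
    t = np.linspace(0.0, T, n + 1)
    prop = brownian_bridge(x0, xT, T, n, rng)
    log_a = girsanov_loglik(prop, t, drift) - girsanov_loglik(cur, t, drift)
    if np.log(rng.uniform()) < log_a:
        return prop, True
    return cur, False
```

Since each proposal is drawn afresh from the bridge measure, acceptance rates fall when the drift makes the target path law far from the Brownian bridge, which is the inefficiency the HMC-on-pathspace approach above aims to overcome.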
“…Standard hybrid Monte Carlo algorithm. We use the hybrid Monte Carlo algorithm to explore the posterior of Z, θ in (6). The standard method was introduced in Duane et al. (1987), but we employ an advanced version, tailored to the structure of the distributions of interest and closely related to algorithms developed in Beskos et al. (2011, 2013a) for effective sampling of change of measures from Gaussian laws in infinite dimensions. We first briefly describe the standard algorithm.…”
Section: An Efficient Markov Chain Monte Carlo Sampler
confidence: 99%
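The standard hybrid (Hamiltonian) Monte Carlo algorithm of Duane et al. (1987), referenced above, can be sketched in finite dimensions as below. The names, the identity mass matrix, and the step-size defaults are illustrative assumptions; the advanced pathspace variant cited above modifies the integrator to stay well defined in infinite dimensions:

```python
import numpy as np

def hmc_step(q, log_post, grad_log_post, eps=0.1, n_leap=10,
             rng=np.random.default_rng()):
    """One standard HMC step for a target pi(q) ∝ exp(log_post(q)):
    draw momentum p ~ N(0, I), integrate the Hamiltonian dynamics
    with the leapfrog scheme, then Metropolis-accept to correct
    the discretization error (illustrative sketch)."""
    p = rng.standard_normal(q.size)
    q_new, p_new = q.copy(), p.copy()
    p_new += 0.5 * eps * grad_log_post(q_new)       # initial half step
    for i in range(n_leap):
        q_new += eps * p_new                        # full position step
        if i < n_leap - 1:
            p_new += eps * grad_log_post(q_new)     # full momentum step
    p_new += 0.5 * eps * grad_log_post(q_new)       # final half step
    h_cur = -log_post(q) + 0.5 * float(p @ p)       # current Hamiltonian
    h_new = -log_post(q_new) + 0.5 * float(p_new @ p_new)
    if np.log(rng.uniform()) < h_cur - h_new:
        return q_new, True
    return q, False
```

The leapfrog integrator nearly conserves the Hamiltonian, so long trajectories can be accepted with high probability, giving the distant, low-correlation moves that make HMC attractive relative to random-walk proposals.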