2018
DOI: 10.48550/arxiv.1810.01382
Preprint

Unbiased estimation of log normalizing constants with applications to Bayesian cross-validation

Maxime Rischard,
Pierre E. Jacob,
Natesh Pillai

Abstract: Posterior distributions often feature intractable normalizing constants, called marginal likelihoods or evidence, that are useful for model comparison via Bayes factors. This has motivated a number of methods for estimating ratios of normalizing constants in statistics. In computational physics, the logarithm of these ratios corresponds to free energy differences. Combining unbiased Markov chain Monte Carlo estimators with path sampling, also called thermodynamic integration, we propose new unbiased estimators o…
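
For context on the terminology in the abstract, the path-sampling (thermodynamic integration) identity underlying the method can be written in standard notation (this is the textbook identity, not an excerpt from the paper; q_t, Z_t and π_t are our notation):

\[
\log\frac{Z_1}{Z_0} \;=\; \int_0^1 \mathbb{E}_{\pi_t}\!\left[\frac{\partial}{\partial t}\log q_t(X)\right] dt,
\qquad
\pi_t(x) = \frac{q_t(x)}{Z_t}, \quad Z_t = \int q_t(x)\,dx,
\]

where (q_t) is a path of unnormalized densities interpolating between the two distributions of interest. As the abstract indicates, the paper's proposal is to estimate the inner expectations with unbiased MCMC estimators rather than ergodic averages, so that the resulting estimator of the log ratio is itself unbiased.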

Cited by 10 publications (12 citation statements). References 32 publications.
“…return (f_t, J_t); end function; (z, Δ log p_ω,t) ← odesolve(aug, (x, 0), 0, T); return log p_0(z) − Δ log p_ω,t. A number of techniques exist for debiasing the logarithm of (22); see Rhee & Glynn (2015) and Rischard et al. (2018), for example. Alternatively, we lie in the setting of semi-implicit variational inference seen in Yin & Zhou (2018) and Titsias & Ruiz (2019), and those techniques directly extend to our case as well.…”
Section: Results
confidence: 99%

Stochastic Normalizing Flows

Hodgkinson,
van der Heide,
Roosta
et al. 2020
Preprint
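
The excerpt above points to Rhee & Glynn (2015) for debiasing the logarithm of an estimator. As a rough, generic illustration of that idea (a minimal Python sketch in our own notation; estimate(n) is a hypothetical sequence of biased estimators whose expectations converge to the target quantity, and this is not code from either paper):

import numpy as np

def coupled_sum_debias(estimate, p_geom=0.5, rng=None):
    # Rhee & Glynn (2015)-style coupled-sum estimator.
    # estimate(n) returns a biased estimate whose expectation converges, as
    # n grows, to the limit we want to estimate without bias. A random
    # truncation level N is drawn, and the increments estimate(n) - estimate(n-1)
    # are reweighted by 1 / P(N >= n); under moment and coupling conditions
    # (successive estimates should share randomness so the increments shrink),
    # the truncated, reweighted sum is an unbiased estimator of the limit.
    rng = np.random.default_rng() if rng is None else rng
    N = rng.geometric(p_geom)                # truncation level, N takes values 1, 2, ...
    total = estimate(0)                      # Delta_0 = estimate(0), weight 1
    prev = total
    for n in range(1, N + 1):
        current = estimate(n)
        tail_prob = (1.0 - p_geom) ** (n - 1)   # P(N >= n) for this geometric law
        total += (current - prev) / tail_prob
        prev = current
    return total

In the setting of the excerpt, estimate(n) could be, for instance, the logarithm of a Monte Carlo average over 2^n samples, so that the bias introduced by taking the logarithm is removed in expectation.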
“…This is of interest as, from the numerical results, they suggest alternative rates for (F3), different to those of the other variants. One could also provide an unbiased estimation of the NC [36], which has connections to MLMC [9,37]. In order to do so, one would require a modified multilevel analysis of the EnKBF, where one has uniform upper bounds with respect to levels l = 1, .…”
Section: Discussion
confidence: 99%
“…From (1.1)–(1.2), V_t and W_t are independent d_y- and d_x-dimensional Brownian motions respectively, with h: R^{d_x} → R^{d_y} and f: R^{d_x} → R^{d_x} denoting potentially nonlinear functions, and σ: R^{d_x} → R^{d_x × d_x} acting as a diffusion coefficient. Aside from computing the filtering distribution, filtering can also be exploited to compute normalizing constants associated with the filtering distribution [7,8,21,27,36], i.e. the marginal likelihood, which is an important and useful computation in Bayesian statistics.…”
Section: Introduction
confidence: 99%
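
Equations (1.1)–(1.2) referenced in this excerpt are not reproduced on this page. A plausible reconstruction, consistent with the components described (signal X_t, observation Y_t, drift f, observation function h, diffusion coefficient σ, and independent Brownian motions W_t, V_t; this is our guess at the standard continuous-time filtering model, not a quotation), is

\[
dX_t = f(X_t)\,dt + \sigma(X_t)\,dW_t, \qquad X_t \in \mathbb{R}^{d_x},
\]
\[
dY_t = h(X_t)\,dt + dV_t, \qquad Y_t \in \mathbb{R}^{d_y},
\]

with the marginal likelihood mentioned in the excerpt being the normalizing constant of the corresponding filtering (posterior) distribution given the observation path.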
“…To reduce implementation effort, it is sometimes possible to introduce a path of distributions (π_t) so that only slight modifications to the existing MCMC algorithm are required to target each bridging distribution. We describe a concrete example taken from Rischard et al. [2018].…”
Section: Path Of Least Coding Effort
confidence: 99%
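
To make the "path of least coding effort" concrete, here is a generic geometric-path sketch in Python. The functions log_target_0 and log_target_1 are hypothetical unnormalized log-densities supplied by the user, and this is an illustrative construction rather than the specific example of Rischard et al. [2018]:

import numpy as np

def make_log_bridge(log_target_0, log_target_1):
    # Geometric path: log pi_t(x) = (1 - t) * log_target_0(x) + t * log_target_1(x).
    # An existing MCMC implementation only needs its log-density callback
    # replaced by this tempered version (the "slight modification" above).
    def log_bridge(x, t):
        return (1.0 - t) * log_target_0(x) + t * log_target_1(x)
    return log_bridge

def random_walk_metropolis(log_density, x0, t, n_steps=1000, step=0.5, rng=None):
    # Plain random-walk Metropolis targeting the bridging distribution pi_t.
    rng = np.random.default_rng() if rng is None else rng
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    logp = log_density(x, t)
    for _ in range(n_steps):
        prop = x + step * rng.standard_normal(x.shape)
        logp_prop = log_density(prop, t)
        if np.log(rng.uniform()) < logp_prop - logp:   # Metropolis accept/reject
            x, logp = prop, logp_prop
    return x

Along such a path, the path-sampling identity shown after the abstract applies with log q_t(x) = (1 − t) log_target_0(x) + t log_target_1(x), so the integrand ∂/∂t log q_t(x) is simply log_target_1(x) − log_target_0(x).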