2023
DOI: 10.1214/22-ba1328
Regularized Zero-Variance Control Variates

Abstract: Zero-variance control variates (ZV-CV) are a post-processing method to reduce the variance of Monte Carlo estimators of expectations using the derivatives of the log target. Once the derivatives are available, the only additional computational effort lies in solving a linear regression problem. Significant variance reductions have been achieved with this method in low-dimensional examples, but the number of covariates in the regression rapidly increases with the dimension of the target. In this paper, we prese…
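The mechanism described in the abstract can be illustrated with a minimal sketch. The following Python example (not the paper's implementation, and not the R `ZVCV` package; function names such as `score` and `zvcv_first_order` are hypothetical) shows first-order ZV-CV for a standard normal target: the integrand is regressed on the score `∇ log π(x)`, which has zero expectation under `π`, so the regression intercept is the variance-reduced estimate.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
x = rng.standard_normal(n)      # samples from the target pi = N(0, 1)

def score(x):
    # Derivative of the log target: for N(0, 1), grad log pi(x) = -x.
    return -x

def zvcv_first_order(f_vals, g_vals):
    """First-order ZV-CV via ordinary least squares.

    Regress f on the score g. Since E[g] = 0 under the target, the
    fitted intercept is the ZV-CV estimate of E[f], and the residual
    variance is the variance of the corrected integrand.
    """
    X = np.column_stack([np.ones_like(g_vals), g_vals])
    beta, *_ = np.linalg.lstsq(X, f_vals, rcond=None)
    resid_var = float(np.var(f_vals - X @ beta))
    return float(beta[0]), resid_var

# f(x) = x lies in the span of the score, so the fit is exact and the
# corrected integrand has (numerically) zero variance.
f = x
estimate, resid_var = zvcv_first_order(f, score(x))
```

Here the plain Monte Carlo mean of `f` carries O(n^{-1/2}) error, while the corrected estimator is exact because `f` lies in the span of `{1, ∇ log π}`; for integrands outside that span (e.g. `f(x) = x²` under this symmetric target), a first-order polynomial gives no reduction, which is why higher-order polynomials, and hence the rapidly growing covariate count discussed in the abstract, are needed.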

Cited by 11 publications (7 citation statements) · References 83 publications
“…Again, the extension is straightforward and consists of applying the ideas from this section to improve multiple expectations. The ZVCV package (South 2020) on CRAN provides functions to apply ZVCV and CF to two estimators of the normalizing constant.…”
Section: Discussion
confidence: 99%
“…This is a significant advantage in the present setting since many applications, including problems where π is a Bayesian posterior distribution, fall into this category. The first Stein-based CVs were proposed by Assaraf and Caffarel [1999], in which U was a finite-dimensional vector space of functions of the form u = ∇p, with p polynomial of fixed degree; see also Mira et al [2013], Papamarkou et al [2014], Friel et al [2014], South et al [2022b]. For additional flexibility, Oates et al [2017] proposed to take U to be a Cartesian product of reproducing kernel Hilbert spaces; see also Oates et al [2019], Barp et al [2022], calling this approach control functionals (CFs).…”
Section: Control Variate Methods
confidence: 99%
“…For Neural CVs it is difficult to go beyond Theorem 1, since for one thing there will not be a unique γ meta in general. However, for simpler CVs, such as those based on polynomial regression [Assaraf and Caffarel, 1999, Mira et al, 2013, Papamarkou et al, 2014, Friel et al, 2014, South et al, 2022b], it is reasonable to assume a unique γ meta and convexity of the Meta-CV objective around this point. In these scenarios, the following corollary shows that γ is typically close to the minimiser of the task-specific objective functional.…”
Section: Theoretical Analysis
confidence: 99%
“…Note that an intercept = TRUE flag may be changed to intercept = FALSE within the function if integrand_logged = TRUE and a NaN is encountered. See South et al (2018) for further details.
• polyorder_max: The maximum allowable polynomial order.…”
Section: Format
confidence: 99%