2020
DOI: 10.1287/mnsc.2018.3250

Randomized Dimension Reduction for Monte Carlo Simulations

Abstract: We present a new unbiased algorithm that estimates the expected value of f(U) via Monte Carlo simulation, where U is a vector of d independent random variables, and f is a function of d variables. We assume that f does not depend equally on all its arguments. Under certain conditions we prove that, for the same computational cost, the variance of our estimator is lower than the variance of the standard Monte Carlo estimator by a factor of order d. Our method can be used to obtain a low-variance unbiased esti…
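As a point of reference for the abstract's comparison, the following is a minimal sketch (not code from the paper) of the standard Monte Carlo estimator of E[f(U)] that the new algorithm is benchmarked against; the dimension d, the sample size n, and the example integrand with geometrically decaying weights are illustrative assumptions.

```python
import numpy as np

def standard_mc(f, d, n, rng=None):
    """Plain Monte Carlo estimate of E[f(U)], U = (U_1, ..., U_d) i.i.d. Uniform(0, 1).

    This is the baseline the abstract compares against: every sample requires a
    full draw of all d coordinates, so the work per sample grows linearly in d.
    Returns the estimate and its standard error.
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.random((n, d))                      # n independent copies of U
    values = np.apply_along_axis(f, 1, u)
    return values.mean(), values.std(ddof=1) / np.sqrt(n)

def f_example(u):
    """Illustrative integrand that 'does not depend equally on all its arguments':
    the weight on coordinate k decays geometrically."""
    weights = 2.0 ** -np.arange(u.size)
    return float(weights @ u)

if __name__ == "__main__":
    est, err = standard_mc(f_example, d=100, n=10_000)
    print(f"standard MC estimate: {est:.4f} +/- {err:.4f}")
```

The paper's estimator targets the regime where much of this per-sample work is spent on coordinates that barely affect f.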


Cited by 9 publications (13 citation statements). References 40 publications.
“…This section considers a class of MLMC unbiased estimators of E(f(U)) based on successive approximations of f by deterministic functions of its first components. In (Kahalé 2020b), a lower bound on the work-normalized variance of such estimators is given in terms of that of the randomized dimension reduction estimator. This section provides a lower bound on the work-normalized variance of these estimators in terms of the truncation dimension.…”
Section: The Lower Bound
“…Owen (2019) gives a recent survey on the effective dimension. Kahalé (2020b) studies the relationship between the truncation dimension and the randomized dimension reduction method, a recent variance reduction technique applicable to high-dimensional problems.…”
Section: Introduction
“…Vihola (2018) describes stratified RMLMC methods that, under certain conditions, are shown to be asymptotically as efficient as MLMC. The randomized dimension reduction method, recently introduced in (Kahalé 2016, Kahalé 2019), is another technique that can provably achieve substantial variance reduction in high-dimensional settings, such as the estimation of the expectation of a functional of a time-varying Markov chain at a long horizon.…”
Section: Introduction
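The "functional of a time-varying Markov chain at a long horizon" mentioned in this excerpt illustrates how d becomes large: one driving variable per transition. The toy autoregressive recursion below is an illustrative assumption, not an example taken from the cited papers.

```python
def markov_functional(u, x0=0.0, phi=0.9):
    """f(U) = X_T for a toy autoregressive chain driven by one uniform per step.

    X_t = phi * X_{t-1} + (2 * U_t - 1), so the nominal dimension d equals the
    horizon T = len(u).  When phi < 1 the chain forgets early shocks, so f depends
    far more strongly on some coordinates than on others.
    """
    x = x0
    for u_t in u:
        x = phi * x + (2.0 * u_t - 1.0)
    return float(x)
```

Because such a chain forgets its early inputs, the influential coordinates are the recent shocks, and one would typically order the arguments by importance before applying truncation-style or dimension-reduction estimators.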
“…In (Rhee and Glynn 2015, Section 3), an algorithm that finds in O(m^3) time an m-truncated distribution that optimizes the efficiency of these estimators is given. On the other hand, the asymptotic efficiency of the randomized dimension reduction method is maximized in (Kahalé 2019) via a new geometric algorithm that solves an m-dimensional optimization problem in O(m) time. Kahalé (2019) points out that the same geometric algorithm solves the optimization problem in (Rhee and Glynn 2015, Section 3) in O(m) time.…”
Section: Introduction
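For the efficiency objective discussed in this excerpt, here is a hedged sketch of how the work-normalized variance (variance times expected cost per sample) of a single-term estimator Z = Δ_N / p_N can be evaluated for candidate truncated level distributions. The second moments, means, and costs below are placeholders, and this brute-force comparison is neither the O(m^3) algorithm of Rhee and Glynn (2015) nor the O(m) geometric algorithm of Kahalé (2019).

```python
import numpy as np

def work_normalized_variance(p, m2, means, costs):
    """Variance x expected cost per sample of the single-term estimator Z = Delta_N / p_N.

    p     : candidate truncated distribution over levels 1..m (must sum to 1)
    m2    : second moments E[Delta_k^2] of the level differences
    means : first moments E[Delta_k]
    costs : expected cost of sampling Delta_k
    """
    p, m2, means, costs = map(np.asarray, (p, m2, means, costs))
    mu = means.sum()                      # E[Z] by the telescoping representation
    var = np.sum(m2 / p) - mu ** 2        # Var(Z) for the single-term estimator
    expected_cost = np.sum(p * costs)     # expected cost of one draw of Z
    return var * expected_cost

# Placeholder inputs: geometrically decaying moments, linearly growing costs.
m = 20
m2 = 4.0 ** -np.arange(m)
means = 2.0 ** -np.arange(m)
costs = np.arange(1, m + 1, dtype=float)

# Brute-force comparison over a few geometric candidate distributions.
candidates = {}
for r in (0.3, 0.5, 0.7):
    w = r ** np.arange(m)
    candidates[r] = w / w.sum()
best = min(candidates, key=lambda r: work_normalized_variance(candidates[r], m2, means, costs))
print("best decay rate among the candidates:", best)
```

The single-term variant shown here is only one member of the family; the algorithms cited in the excerpt optimize the precise efficiency objectives of their respective papers.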