2019
DOI: 10.1145/3306346.3323009

Optimal multiple importance sampling

Abstract: Multiple Importance Sampling (MIS) is a key technique for achieving robustness of Monte Carlo estimators in computer graphics and other fields. We derive optimal weighting functions for MIS that provably minimize the variance of an MIS estimator, given a set of sampling techniques. We show that the resulting variance reduction over the balance heuristic can be higher than predicted by the variance bounds derived by Veach and Guibas, who assumed only non-negative weights in their proof. We theoretically analyze…
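The setting the abstract describes can be illustrated with a minimal sketch (not code from the paper): two sampling techniques combined with the balance heuristic, w_i(x) = p_i(x) / (p_1(x) + p_2(x)), which is the baseline the paper's optimal weights provably improve upon. The integrand, pdfs, and sample counts below are invented for illustration.

```python
import math
import random

# Illustrative sketch: estimate I = integral of x^2 over [0, 1] (= 1/3)
# by MIS over two techniques, combined with the balance heuristic.

def f(x):
    return x * x        # integrand

def p1(x):
    return 1.0          # technique 1: uniform pdf on [0, 1]

def p2(x):
    return 2.0 * x      # technique 2: linear pdf, sampled via inverse CDF

def mis_balance(n, rng):
    """Balance-heuristic MIS with n samples from each technique."""
    total = 0.0
    for _ in range(n):
        # Draw from technique 1 and apply its balance-heuristic weight.
        x = rng.random()
        total += (p1(x) / (p1(x) + p2(x))) * f(x) / p1(x)
        # Draw from technique 2 via inverse CDF: x = sqrt(u).
        u = rng.random()
        if u > 0.0:      # guard against p2(0) = 0
            x = math.sqrt(u)
            total += (p2(x) / (p1(x) + p2(x))) * f(x) / p2(x)
    return total / n

estimate = mis_balance(200_000, random.Random(1))
print(estimate)  # should be close to 1/3
```

Each sample is weighted by its technique's share of the combined density, so the two per-technique estimates sum to an unbiased estimate of the integral; the paper's contribution is that weights minimizing the variance of this combination can be negative, unlike the balance-heuristic weights above.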

Cited by 38 publications (25 citation statements); references 27 publications.
“…Ideally, we hope these weighting functions can minimize the sum of variances at each position. Formally, we have […]. Optimizing weighting functions to minimize the variance of an ordinary MIS estimator has been studied in recent work (Kondapaneni et al., 2019). Our setting differs in that (1) we focus on self-normalized MIS and (2) the function f(·) is vector-valued rather than scalar-valued.…”
Section: Analysis on the Optimal Weighting Function in Multiple Impor…
confidence: 99%
“…The balance heuristic is also present in most successful adaptive IS (AIS) methods, see [11, 12, 13, 14, 15], in particular when all techniques are used to simulate the same number of samples. Recently, MIS has returned to the main focus in computer graphics: in [16], where it is shown that, by allowing negative weights, the variance reduction of the resulting MIS estimator over the balance heuristic can be higher than predicted by the Veach bounds [17]; in [18], where one of the sampling techniques is optimized to decrease the overall variance of the resulting MIS estimator; in [19], where the weights are made proportional to the quotients of the second moments divided by the variances of the independent techniques; and in [20], where MIS is generalized to uncountably infinite sets of techniques.…”
Section: Introduction
confidence: 99%
“…Another avenue would be to couple it with progressive photon mapping [8, 9] or vertex merging techniques [10]. Yet another potential area is coupling it with the latest results on improved multiple importance sampling [15, 19, 29]. Finally, it would be interesting to automate the choice of the constant in our biased estimator to minimize the total error, seen as the sum of bias and variance.…”
Section: Future Work
confidence: 99%