2018
DOI: 10.1137/16m1093549
Importance Sampling and Necessary Sample Size: An Information Theory Approach

Abstract: Importance sampling approximates expectations with respect to a target measure by using samples from a proposal measure. The performance of the method over large classes of test functions depends heavily on the closeness between the two measures. We derive a general bound that must hold for importance sampling to be successful, relating the f-divergence between the target and the proposal to the sample size. The bound is deduced from a new and simple information theory paradigm for the study of importance…
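To make the setup concrete, here is a minimal sketch of autonormalized (self-normalized) importance sampling in Python, assuming an illustrative 1D Gaussian target and proposal; the densities, parameters, and test function are my own choices, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative choices (not from the paper): target N(1, 1), proposal N(0, 4).
def log_target(x):
    return -0.5 * (x - 1.0) ** 2      # unnormalized log-density of N(1, 1)

def log_proposal(x):
    return -0.5 * (x / 2.0) ** 2      # unnormalized log-density of N(0, 4)

N = 10_000
x = rng.normal(0.0, 2.0, size=N)          # sample from the proposal

log_w = log_target(x) - log_proposal(x)   # log importance weights
w = np.exp(log_w - log_w.max())           # exponentiate stably
w /= w.sum()                              # autonormalization step

phi = x ** 2                              # test function phi(x) = x^2
print(np.sum(w * phi))                    # estimates E_target[phi] = 2 here

Autonormalization makes the estimator usable when the target density is known only up to a normalizing constant, which is the setting where the necessary-sample-size bound discussed in the citation statements below is most relevant.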

Cited by 27 publications (30 citation statements) | References 19 publications
“…The idea of Proposition 2 is inspired by [7], but adapted here from relative entropy to χ²-divergence. Our results complement sufficient conditions on the sample size derived in [1] and necessary conditions for un-normalized (as opposed to autonormalized) importance sampling in [8]. In Section 3, Proposition 4 gives a closed formula for the χ²-divergence between posterior and prior in a linear-Gaussian Bayesian inverse problem setting.…”
Section: Introduction (supporting)
confidence: 78%
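For context, the χ²-divergence referred to in this statement is the f-divergence generated by f(t) = t² − 1; in standard notation (mine, not necessarily the citing paper's):

D_{\chi^2}(\pi \,\|\, \mu) \;=\; \int \left( \frac{d\pi}{d\mu} \right)^{2} d\mu \;-\; 1 \;=\; \rho - 1,
\qquad \rho := \mathbb{E}_{\mu}\!\left[ \left( \frac{d\pi}{d\mu} \right)^{2} \right],

so statements about the χ²-divergence and statements about the second moment ρ of the importance weights are interchangeable up to an additive constant.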
“…The proof follows the approach of [37] for evaluating moments of ratios. Despite the complicated dependence of the error constants on the problem at hand, there is further evidence for the centrality of the second moment ρ in the paper [91]. There it is shown (see Remark 4) that, when ρ is finite, a necessary condition for accuracy within the class of functions with bounded second moment under the proposal is that the sample size N be of the order of the χ² divergence, and hence of the order of ρ.…”
Section: Complementary Analyses Of Importance Sampling Error Provided (mentioning)
confidence: 90%
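As a quick numerical illustration of that remark (with an illustrative Gaussian pair, not taken from either paper), one can compute ρ by quadrature and read off the order of the necessary sample size:

import numpy as np

# Illustrative target pi = N(1, 1) and proposal mu = N(0, 4); rho is finite
# here because the proposal has heavier tails than the target.
def pdf(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

x = np.linspace(-30.0, 30.0, 200_001)
dx = x[1] - x[0]

# rho = E_mu[(dpi/dmu)^2] = integral of pi(x)^2 / mu(x) dx (Riemann sum).
rho = np.sum(pdf(x, 1.0, 1.0) ** 2 / pdf(x, 0.0, 2.0)) * dx
chi2 = rho - 1.0

print(f"rho = {rho:.3f}, chi^2 divergence = {chi2:.3f}")
# Per the cited remark, a sample size N of at least the order of chi2
# (equivalently of rho) is necessary for accuracy over test functions
# with bounded second moment under the proposal.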