Wiley StatsRef: Statistics Reference Online 2021
DOI: 10.1002/9781118445112.stat08284

Advances in Importance Sampling

Abstract: Importance sampling (IS) is a Monte Carlo technique for the approximation of intractable distributions and integrals with respect to them. The origin of IS dates from the early 1950s. In the last decades, the rise of the Bayesian paradigm and the increase of the available computational resources have propelled the interest in this theoretically sound methodology. In this paper, we first describe the basic IS algorithm and then revisit the recent advances in this methodology. We pay particular attention to two …
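As a point of reference for the "basic IS algorithm" the abstract refers to, here is a minimal sketch; the standard-normal target, the wider Gaussian proposal, and the test function f(x) = x² are illustrative choices of ours, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target density: standard normal (assumed here purely for illustration).
def target_pdf(x):
    return np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)

# Proposal density: N(0, 2^2), chosen with heavier tails than the target.
mu_q, sigma_q = 0.0, 2.0
def proposal_pdf(x):
    z = (x - mu_q) / sigma_q
    return np.exp(-0.5 * z**2) / (sigma_q * np.sqrt(2.0 * np.pi))

# Basic IS: draw from the proposal, weight by target/proposal.
N = 10_000
x = rng.normal(mu_q, sigma_q, size=N)
w = target_pdf(x) / proposal_pdf(x)

# Estimate E_target[X^2] (exact value is 1 for a standard normal).
print(np.mean(w * x**2))
```

The weights correct for drawing from the "wrong" distribution; the estimator is unbiased whenever the proposal's support covers the target's.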

Cited by 38 publications (20 citation statements)
References 98 publications
“…through the self-normalized importance sampling (SNIS) estimator (Elvira and Martino, 2021). These importance sampling weights can be used to obtain summary statistics/distributions of interest.…”
Section: Algorithm (mentioning)
confidence: 99%
“…This computational cost is in terms of (1) time per iteration (due to the number of auxiliary variables and the cost of evaluating the likelihood function), and (2) the length of MCMC simulations required, since poorer mixing is often observed due to increased correlation between the parameters (notably for the random effects, ε₁, and σ²). Alternatively, smaller subsamples provide subposteriors from which it is (relatively) computationally fast to obtain a sample, but for which the following importance sampling algorithm may suffer from increased particle depletion due to differences between the subposterior and the full posterior (see Elvira and Martino, 2021, for further discussion). Further, in this case, there is an increased computational cost in the calculation of the importance sampling weight, as this is a function of the remaining data, though this is minimised when using an alternative (biased) weight calculation making use of repeated histories (see Section 4.3, consideration (iii)).…”
Section: Subsample Size (mentioning)
confidence: 99%
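The "particle depletion" this excerpt warns about, where a few weights dominate when the subposterior proposal differs from the full posterior, is commonly diagnosed with the effective sample size of the normalized weights; a minimal sketch, with both weight vectors simulated purely for illustration:

```python
import numpy as np

def effective_sample_size(w):
    """ESS = 1 / sum(w_i^2) for weights normalized to sum to one.

    Ranges from 1 (one particle carries all the weight, i.e. severe
    depletion) up to len(w) (perfectly uniform weights).
    """
    w = np.asarray(w, dtype=float)
    w = w / w.sum()
    return 1.0 / np.sum(w**2)

rng = np.random.default_rng(2)

# Proposal close to the target: nearly uniform weights, ESS near N.
w_matched = rng.uniform(0.9, 1.1, size=1000)

# Subposterior far from the full posterior: heavy-tailed log-weights,
# so a handful of particles dominate and the ESS collapses.
w_mismatched = np.exp(rng.normal(0.0, 5.0, size=1000))

print(effective_sample_size(w_matched))     # close to 1000
print(effective_sample_size(w_mismatched))  # far below 1000
```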
“…QMC with such low-discrepancy point sets can reach rates of convergence arbitrarily close to O(N^{-1}) when f satisfies some regularity conditions (Papageorgiou, 2003). In comparison, common variance-reduction techniques in reinforcement learning (e.g., control variates or importance sampling) only improve the constant factors in front of the O(N^{-1/2}) rate (Glynn and Szechtman, 2002; Elvira and Martino, 2021).…”
Section: Introduction (mentioning)
confidence: 99%
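The rate comparison quoted here can be reproduced empirically: estimate a smooth integral with plain Monte Carlo and with a scrambled Sobol' sequence and watch the error decay. A minimal sketch using scipy.stats.qmc; the product-of-uniforms integrand is an assumption of ours, chosen because its exact value is known.

```python
import numpy as np
from scipy.stats import qmc

# Smooth test integrand on [0, 1]^d with known integral (1/2)^d.
d = 4
f = lambda u: np.prod(u, axis=1)
exact = 0.5**d

rng = np.random.default_rng(3)

for m in (8, 12, 16):  # sample sizes N = 2^m
    n = 2**m
    # Plain Monte Carlo: error shrinks at roughly O(N^{-1/2}).
    mc_err = abs(f(rng.random((n, d))).mean() - exact)
    # Scrambled Sobol' QMC: error decays close to O(N^{-1}) for smooth f.
    sobol = qmc.Sobol(d=d, scramble=True, seed=3)
    qmc_err = abs(f(sobol.random_base2(m)).mean() - exact)
    print(f"N=2^{m:>2}: MC error {mc_err:.1e}, QMC error {qmc_err:.1e}")
```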
“…In order to address such questions, we bring together in one place, and using a common notational framework, the four main¹ …” [Footnote 1: “We acknowledge, of course, that there have been many concurrent advances in MCMC and importance sampling designed, in particular, to deal with the problem of scale. We refer the reader to Green et al. (2015), Robert et al. (2018), and Dunson and Johndrow (2019) for broad overviews of modern developments in MCMC; to Betancourt (2018) for a review of Hamiltonian Monte Carlo (HMC); to Naesseth et al. (2019) for a recent review of sequential Monte Carlo (exploiting importance sampling principles as it does); and to Hoogerheide et al. (2009), Tokdar and Kass (2010), and Elvira and Martino (2021) for other advances in importance sampling.”]
Section: Introduction (mentioning)
confidence: 99%