2020
DOI: 10.48550/arxiv.2001.09266
Preprint
The reproducing Stein kernel approach for post-hoc corrected sampling

Liam Hodgkinson,
Robert Salomone,
Fred Roosta

Abstract: Stein importance sampling [42] is a widely applicable technique based on kernelized Stein discrepancy [43], which corrects the output of approximate sampling algorithms by reweighting the empirical distribution of the samples. A general analysis of this technique is conducted for the previously unconsidered setting where samples are obtained via the simulation of a Markov chain, and applies to an arbitrary underlying Polish space. We prove that Stein importance sampling yields consistent estimators for quantit…

Cited by 16 publications (30 citation statements)
References 38 publications (91 reference statements)
“…Liu and Lee (2017) considered the use of kernel Stein discrepancy to optimally weight an arbitrary set (X_i)_{i=1}^n ⊂ R^d of states in a manner loosely analogous to importance sampling, at a computational cost of O(n^3). The combined effect of applying the approach of Liu and Lee (2017) to MCMC output was analysed in Hodgkinson et al. (2020), who established situations in which the overall procedure will be consistent. The present paper differs from Liu and Lee (2017) and Hodgkinson et al. (2020) in that we attempt compression, rather than weighting, of the MCMC output.…”
Section: Related Work
confidence: 99%
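The O(n^3)-cost weighting scheme described in the statement above can be sketched as a small quadratic program: build the Stein kernel matrix K_p from the target's score function, then minimise w^T K_p w over the probability simplex. The sketch below is our own illustration for a one-dimensional standard-normal target with an RBF base kernel; the function names and the choice of SLSQP solver are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

def stein_kernel_matrix(x, score, h=1.0):
    """Langevin Stein kernel matrix for 1-d samples x under an RBF base
    kernel with bandwidth h, where score(x) = d/dx log p(x) for target p.

    k_p(x,y) = s(x)s(y)k + s(x) d_y k + s(y) d_x k + d_x d_y k
    """
    d = x[:, None] - x[None, :]            # pairwise differences x_i - x_j
    k = np.exp(-d**2 / (2 * h**2))         # base RBF kernel
    s = score(x)
    sx, sy = s[:, None], s[None, :]
    return k * (sx * sy + (sx - sy) * d / h**2 + 1.0 / h**2 - d**2 / h**4)

def stein_weights(x, score, h=1.0):
    """Minimise w^T K_p w over the probability simplex (hypothetical
    helper mimicking the weighting step of Liu and Lee, 2017)."""
    n = len(x)
    K = stein_kernel_matrix(x, score, h)
    res = minimize(
        lambda w: w @ K @ w,
        np.full(n, 1.0 / n),               # start from uniform weights
        jac=lambda w: 2 * K @ w,
        method="SLSQP",
        bounds=[(0.0, 1.0)] * n,
        constraints={"type": "eq", "fun": lambda w: w.sum() - 1.0},
    )
    return res.x

# Biased samples (mean 1) reweighted towards a standard-normal target,
# whose score function is s(x) = -x.
rng = np.random.default_rng(0)
x = rng.normal(1.0, 1.0, size=100)
w = stein_weights(x, score=lambda t: -t)
```

In practice the reweighted empirical distribution sum_i w_i δ_{x_i} replaces the uniform one; the O(n^3) cost cited above comes from solving this n-variable quadratic program with a dense n×n kernel matrix.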
“…This result, and the related results in Liu & Lee (2017) and Hodgkinson et al. (2020), weaken or remove the requirement to design Markov chains that are exactly P-invariant. See also Gramacy et al. (2010) and Radivojević & Akhmatskaya (2020).…”
Section: Tail Condition for Stein Discrepancy
confidence: 59%

Post-Processing of MCMC

South, Riabiz, Teymur et al. (2021)
Preprint
“…The requirement in Lemma 4 for the x^(i) to be distinct precludes, for example, the direct use of Metropolis-Hastings output. However, as emphasized in Oates et al. (2017) for control functionals and studied further in Liu and Lee (2017) and Hodgkinson et al. (2020), the consistency of I_SECF does not require that the Markov chain is p-invariant. It is therefore trivial to, for example, filter out duplicate states from Metropolis-Hastings output.…”
Section: Computation for the Proposed Methods
confidence: 99%
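The filtering step mentioned above is indeed a one-liner: rejected Metropolis-Hastings proposals leave the chain in the same state, so duplicate rows can be dropped before weighting. A minimal sketch (the helper name is our own, not from the cited papers):

```python
import numpy as np

def filter_duplicates(chain):
    """Drop repeated states from Metropolis-Hastings output.

    Rejections duplicate the current state; np.unique over rows keeps
    one copy of each distinct visited state (in sorted order).
    """
    return np.unique(chain, axis=0)

# Toy chain with repeated states from rejected proposals.
chain = np.array([[0.0], [0.0], [1.5], [1.5], [1.5], [-0.3]])
distinct = filter_duplicates(chain)   # 3 distinct states remain
```

Because consistency of the weighted estimator does not require p-invariance of the chain, this filtered (and hence no longer Markov-exact) set of states is still a valid input to the weighting step.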
“…Note that our choice of Stein operator differs from that in Chwialkowski et al. (2016) and Liu et al. (2016). There has been substantial recent research into the use of kernel Stein discrepancies for assessing algorithm performance in the Bayesian computational context (Gorham and Mackey, 2017; Chen et al., 2018, 2019; Singhal et al., 2019; Hodgkinson et al., 2020), and one can also exploit this discrepancy as a diagnostic for the performance of the semi-exact control functional. The second quantity |f|_{k_0,F} in the bound can be approximated by…”
[Figure: mean absolute error and mean upper bound plotted against m]
Section: Diagnostics
confidence: 99%