2021
DOI: 10.48550/arxiv.2105.00520
Preprint
Sampling by Divergence Minimization

Abstract: We introduce a family of Markov Chain Monte Carlo (MCMC) methods designed to sample from target distributions with irregular geometry using an adaptive scheme. In cases where targets exhibit non-Gaussian behaviour, we propose that adaptation should be regional in nature as opposed to global. Our algorithms minimize the information projection side of the Kullback-Leibler (KL) divergence between the proposal distribution class and the target to encourage proposals distributed similarly to the regional geometry of …
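The "information projection" the abstract refers to is the direction KL(q || π) of the KL divergence, minimized over the proposal class q. A minimal sketch of what that objective looks like, assuming an illustrative 1D Gaussian proposal and a standard-normal target (these are stand-in choices, not the paper's actual experimental setup):

```python
import numpy as np

# Monte Carlo estimate of the information projection KL(q || pi)
# for a Gaussian proposal q = N(mu, sigma^2) and target pi = N(0, 1).
# An adaptive scheme would tune (mu, sigma) to drive this quantity down.

rng = np.random.default_rng(0)
mu, sigma = 1.0, 0.5

def log_q(x):
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)

def log_pi(x):
    return -0.5 * np.log(2 * np.pi) - x**2 / 2

# KL(q || pi) = E_q[log q(x) - log pi(x)], estimated with samples from q.
x = mu + sigma * rng.standard_normal(200_000)
kl_mc = np.mean(log_q(x) - log_pi(x))

# Closed form for two Gaussians, used here as a sanity check.
kl_exact = 0.5 * (mu**2 + sigma**2 - 1 - np.log(sigma**2))
```

For Gaussian proposal and target the divergence has a closed form, so the Monte Carlo estimate can be checked directly; for irregular targets only the sample-based estimate is available, which is what motivates the adaptive, gradient-based treatment.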

Cited by 1 publication (1 citation statement)
References 15 publications (17 reference statements)
“…Variations of the entropy objective. Recent work [18, 11] has suggested adding the cross-entropy term ∫∫ π(q) r(q′ | q) log π(q′) dq′ dq to the entropy objective for optimizing the parameters of a Metropolis-Hastings kernel with proposal density r(q′ | q). Algorithm 1 can be adjusted to such variations, possibly by stopping gradients through ∇U as for optimizing the energy error term.…”
Section: Discussion and Outlook
confidence: 99%
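The cross-entropy term in the citation statement is a double expectation, so it admits a simple nested Monte Carlo estimate: draw q from the target (in practice, from MCMC states), then q′ from the proposal kernel, and average log π(q′). A sketch under illustrative assumptions — π = N(0, 1) and a random-walk kernel r(q′ | q) = N(q, τ²), neither taken from the cited work:

```python
import numpy as np

# Nested Monte Carlo estimate of the cross-entropy term
#   ∫∫ pi(q) r(q'|q) log pi(q') dq' dq
# with stand-in choices pi = N(0, 1) and r(q'|q) = N(q, tau^2).

rng = np.random.default_rng(1)
tau = 0.5

def log_pi(x):
    return -0.5 * np.log(2 * np.pi) - x**2 / 2

# Outer samples q ~ pi (these would be chain states in practice),
# inner samples q' ~ r(.|q); average log pi(q') over both levels.
q = rng.standard_normal(100_000)
q_prime = q + tau * rng.standard_normal(100_000)
cross_entropy_mc = np.mean(log_pi(q_prime))

# Marginally q' ~ N(0, 1 + tau^2), so the term has a closed form here,
# usable as a sanity check for the estimator.
cross_entropy_exact = -0.5 * np.log(2 * np.pi) - 0.5 * (1 + tau**2)
```

Since the term depends on the kernel parameters (here τ) only through the inner samples, a reparameterized estimate like this one can be differentiated with respect to those parameters when it is added to the entropy objective.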