Dynamical Sampling with Langevin Normalization Flows
2019. DOI: 10.3390/e21111096
Abstract: In Bayesian machine learning, sampling methods provide asymptotically unbiased estimates for inference over complex probability distributions, and Markov chain Monte Carlo (MCMC) is one of the most popular such methods. However, MCMC can produce highly autocorrelated samples or perform poorly on some complex distributions. In this paper, we introduce Langevin diffusions into normalization flows to construct a new dynamical sampling method. We propose the modified Kullback-Leibler divergence…
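As background for the Langevin-diffusion ingredient of the abstract, the following is a minimal sketch of one unadjusted Langevin step in NumPy. The step size eps, the gradient function grad_log_p, and the Gaussian toy target are illustrative assumptions, not the paper's exact construction, which couples the diffusion with a normalization flow.

```python
import numpy as np

def langevin_step(x, grad_log_p, eps, rng):
    """One unadjusted Langevin step: drift along the gradient of
    log p, plus Gaussian noise of matched scale."""
    noise = rng.standard_normal(x.shape)
    return x + 0.5 * eps * grad_log_p(x) + np.sqrt(eps) * noise

# Toy usage (illustrative): sample a standard 2-D Gaussian,
# for which grad log p(x) = -x.
rng = np.random.default_rng(0)
x = np.zeros(2)
samples = []
for _ in range(5000):
    x = langevin_step(x, lambda v: -v, eps=0.1, rng=rng)
    samples.append(x)
```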

Cited by 3 publications (9 citation statements). References 32 publications.
“…In our setup, however, exact sampling is guaranteed by the M-H process, so the KL divergence loss is no longer applicable. As in variational inference, the normalizing flow Langevin MC (NFLMC) [11] also uses a KL divergence loss. Strictly speaking, this model is a normalizing flow but not an MCMC method.…”
Section: Related Work: Other Samplers Inspired by HMC (citation type: mentioning, confidence: 99%)
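For reference, the M-H correction mentioned in this excerpt accepts or rejects each proposal so that the chain targets the exact distribution. A minimal sketch, assuming a symmetric proposal kernel (a simplifying assumption, so the proposal density cancels in the acceptance ratio):

```python
import numpy as np

def mh_step(x, log_p, propose, rng):
    """One Metropolis-Hastings step with a symmetric proposal:
    accept x' with probability min(1, p(x') / p(x))."""
    x_prop = propose(x, rng)
    if np.log(rng.uniform()) < log_p(x_prop) - log_p(x):
        return x_prop
    return x
```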
“…Recently, approaches have been proposed that inherit the exact sampling property from the MCMC method while potentially mitigating the described issues of unfavorable geometry. One approach is MCMC samplers augmented with neural networks [9-11]; the other is neural transport MCMC techniques [12,13]. A disadvantage of these recent techniques is that their objectives optimize the quality of proposed samples but do not explicitly encourage fast exploration by the sampler.…”
Section: Introduction (citation type: mentioning, confidence: 99%)
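As a rough illustration of the neural-transport idea in this excerpt (a sketch under assumed interfaces, not the implementation of any cited paper): run a simple random-walk Metropolis chain in the latent space of a trained bijection f, targeting the pulled-back density log p(f(z)) + log |det J_f(z)|, and map accepted states back through f. The affine f below is a hypothetical stand-in for a trained flow.

```python
import numpy as np

def neural_transport_rwm(log_p, f, log_det_jac, n_steps, dim, rng, step=0.3):
    """Sketch of neural-transport MCMC: random-walk Metropolis in the
    latent space of a bijection f, targeting the pulled-back density
    log p(f(z)) + log|det J_f(z)|; accepted latent states are pushed
    through f to give samples in x-space."""
    def log_target(z):
        return log_p(f(z)) + log_det_jac(z)
    z = np.zeros(dim)
    xs = []
    for _ in range(n_steps):
        z_prop = z + step * rng.standard_normal(dim)
        if np.log(rng.uniform()) < log_target(z_prop) - log_target(z):
            z = z_prop
        xs.append(f(z))
    return np.array(xs)

# Toy usage: a fixed affine map stands in for a trained flow.
rng = np.random.default_rng(0)
f = lambda z: 2.0 * z + 1.0
log_det = lambda z: z.size * np.log(2.0)   # |det J_f| = 2^dim for this f
log_p = lambda x: -0.5 * np.sum(x ** 2)    # unnormalized N(0, I) target
samples = neural_transport_rwm(log_p, f, log_det, 2000, 2, rng)
```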
“…Some methods use permissions alone to classify Android apps [11,12,21], while others combine permissions with additional features (such as APIs and CFGs) for classification [22,23,24]. Wang et al. [12] analyzed the risks of individual permissions and of collaborative permissions.…”
Section: Introduction (citation type: mentioning, confidence: 99%)
“…In machine learning, complex probabilistic models usually require computing high-dimensional integrals [24]. For example, in a classification task we need to predict the class of an instance.…”
Section: Introduction (citation type: mentioning, confidence: 99%)
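The integral alluded to here is typically the posterior predictive distribution. Written out in standard Bayesian notation (this equation is illustrative, not quoted from the citing paper):

```latex
% Posterior predictive for a new input x^*, marginalizing over the
% model parameters \theta given the observed data \mathcal{D}:
p(y \mid x^{*}, \mathcal{D})
  = \int p(y \mid x^{*}, \theta)\, p(\theta \mid \mathcal{D})\, d\theta
```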