2015
DOI: 10.48550/arxiv.1505.05770
Preprint

Variational Inference with Normalizing Flows

Cited by 221 publications (309 citation statements). References 0 publications.

“…However, most methods do not model data density on the manifold and are thus not applicable for the same purposes as the models introduced in our work. Popular density-based probabilistic models such as Variational Auto-Encoder [21] or Flow variants [20,35,11,26,38] assume the data distribution is a.c. and as such cannot model distributions that lie on a low-dimensional manifold. A number of recent works do however introduce normalizing flows on manifolds such as Riemannian Flows [14,36] and M-Flow [5].…”
Section: Comparisons With Related Work
Mentioning confidence: 99%
“…Normalizing flows [35,22] have shown considerable potential for the task of modelling and inferring expressive distributions through the learning of well-specified probabilistic models. Specifically, assume an absolutely continuous (a.c.) random variable (r.v.)…”
Section: Introduction
Mentioning confidence: 99%
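
For context, the identity these flow constructions rely on is the change of variables formula stated in Rezende and Mohamed (2015): if z is an a.c. random variable with density q_0 and f is an invertible, differentiable map, then z' = f(z) has a tractable density, written in LaTeX as

\log q_1(z') = \log q_0(z) - \log \left| \det \frac{\partial f}{\partial z} \right|, \qquad z' = f(z), \; z \sim q_0.
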
“…Normalizing Flows are a class of generative models where a 'simple' parameterizable base distribution is transformed into a more complex approximation of the posterior distribution (Rezende and Mohamed 2015). This transformation is achieved by passing the base distribution through a series of invertible (bijective) mappings.…”
Section: Flow-based Generative Models
Mentioning confidence: 99%
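
To make the "series of invertible mappings" concrete, below is a minimal numpy sketch of a single planar flow layer, the transform f(z) = z + u * tanh(w.z + b) used as the running example in Rezende and Mohamed (2015); the function and variable names here are illustrative rather than taken from any particular library.

import numpy as np

def planar_flow(z, u, w, b):
    # One planar flow layer f(z) = z + u * tanh(w.z + b), applied row-wise
    # to samples z of shape (N, D). Returns the transformed samples and
    # log|det Jacobian| = log|1 + u.psi(z)|, the term that enters the flow's
    # density update. For invertibility the paper constrains w.u >= -1
    # (that reparameterization is omitted here for brevity).
    a = z @ w + b                             # (N,) pre-activations w.z + b
    f_z = z + np.outer(np.tanh(a), u)         # (N, D) transformed samples
    psi = np.outer(1.0 - np.tanh(a) ** 2, w)  # (N, D) h'(a) * w
    log_det = np.log(np.abs(1.0 + psi @ u))   # (N,)
    return f_z, log_det

# Example: push samples from a standard normal base distribution through one layer.
rng = np.random.default_rng(0)
D = 2
z0 = rng.standard_normal((1000, D))
u, w, b = rng.normal(size=D), rng.normal(size=D), 0.0
z1, log_det = planar_flow(z0, u, w, b)
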
“…Other popular approaches in this domain include fully visible belief networks like NADE (Uria et al 2016), MADE (Kumar et al 2016), PixelCNN, PixelRNN (Van den Oord et al 2016) and WaveNet. Normalizing Flows (Rezende and Mohamed 2015) fall under the category broadly known as change of variable models, which employ the change of variables theorem to transform a simple parameterizable base distribution into some complex approximation of the posterior distribution. In contrast to GANs and VAEs, normalizing flows are an attractive class due to their ability to obtain tractable density estimates.…”
Section: Normalizing Flows
Mentioning confidence: 99%
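
As an illustration of the tractable density estimates that the change of variables theorem provides, here is a small self-contained numpy sketch using an invertible element-wise affine flow; the flow, parameter names, and Gaussian base distribution are assumptions chosen for illustration, not part of the cited works.

import numpy as np

# Invertible affine flow x = exp(log_s) * z + t. Its inverse and its
# log|det Jacobian| (= sum(log_s)) are available in closed form, which is
# what makes the density of the transformed variable exactly computable.
def forward(z, log_s, t):
    return np.exp(log_s) * z + t

def inverse(x, log_s, t):
    return (x - t) * np.exp(-log_s)

def log_prob(x, log_s, t):
    # Change of variables: log q(x) = log q0(f^{-1}(x)) - log|det J_f|,
    # with a standard normal base density q0.
    z = inverse(x, log_s, t)
    log_q0 = -0.5 * np.sum(z ** 2 + np.log(2.0 * np.pi), axis=-1)
    return log_q0 - np.sum(log_s)

rng = np.random.default_rng(1)
log_s, t = np.array([0.3, -0.2]), np.array([1.0, 0.5])
z0 = rng.standard_normal((5, 2))   # samples from the base distribution
x = forward(z0, log_s, t)          # samples from the transformed distribution
print(log_prob(x, log_s, t))       # exact log-densities of those samples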