2020
DOI: 10.48550/arxiv.2002.07101
Preprint

Augmented Normalizing Flows: Bridging the Gap Between Generative Flows and Latent Variable Models

Cited by 23 publications (43 citation statements: 2 supporting, 41 mentioning, 0 contrasting)
References 0 publications
Citing publications span 2020–2023.

Citation statements (ordered by relevance):
“…VAEs: VDVAE (Child, 2020) 2.87 / -; NVAE (Vahdat & Kautz, 2020) 2.91 / 23.49; BIVA (Maaløe et al., 2019) 3.08 / -. Flows: FFJORD (Grathwohl et al., 2018) 3.40 / -; VFlow (Chen et al., 2020) 2.98 / -; ANF (Huang et al., 2020) 3.05 / -.…”
Section: Results (mentioning)
Confidence: 99%
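
The paired numbers in this quote are entries from a flattened results table; the values are consistent with CIFAR-10 test likelihoods in bits per dimension (bpd), the metric on which these cited models are usually compared (the second number, where present, is plausibly an FID score). As a hedged aside, a minimal sketch of the standard nats-to-bpd conversion; the function name and the example figures are illustrative assumptions, not taken from the quoted paper:

    import math

    def nats_to_bits_per_dim(nll_nats: float, num_dims: int) -> float:
        # bpd = NLL / (D * ln 2), with D the data dimensionality,
        # e.g. D = 3 * 32 * 32 = 3072 for a CIFAR-10 image.
        return nll_nats / (num_dims * math.log(2))

    # Illustrative: a negative log-likelihood of ~6500 nats on one
    # CIFAR-10 image is ~3.05 bpd, the range reported for ANF above.
    print(nats_to_bits_per_dim(6500.0, 3 * 32 * 32))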

“…Note that we did not focus on training our models towards high likelihood. Furthermore, Huang et al. (2020) recently trained augmented Normalizing Flows, which have conceptual similarities with our velocity augmentation. Methods leveraging auxiliary variables similar to our velocities are also used in statistics, such as Hamiltonian Monte Carlo (Neal, 2011), and have found applications, for instance, in Bayesian machine learning (Ding et al., 2014; Shang et al., 2015).…”
Section: Related Work (mentioning)
Confidence: 99%
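
Both augmented normalizing flows and Hamiltonian Monte Carlo rest on the auxiliary-variable trick this quote alludes to: lift the density over x to a joint density over (x, v) whose x-marginal is unchanged. A minimal sketch in standard notation; the Gaussian choice for v is the textbook convention, not a claim about the cited papers' exact construction:

    p(x) = \int p(x, v)\, dv, \qquad p(x, v) = p(x)\, \mathcal{N}(v; 0, I)

    H(x, v) = -\log p(x) + \tfrac{1}{2} \lVert v \rVert^2, \qquad e^{-H(x, v)} \propto p(x, v)

Sampling the joint and discarding v recovers samples from p(x); the velocity v only serves to make the dynamics easier to simulate.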

“…The most basic questions revolve around the representational power of such models. Even the question of universal approximation was only recently answered by three concurrent papers (Huang et al., 2020; Zhang et al., 2020; Koehler et al., 2020), though in a less-than-satisfactory manner, in light of how normalizing flows are trained. Namely, Huang et al. (2020) and Zhang et al. (2020) show that any (reasonably well-behaved) distribution p, once padded with zeros and treated as a distribution in R^(d+d'), can be arbitrarily closely approximated by an affine coupling flow. While such padding can be operationalized as an algorithm by padding the training image with zeros, it is never done in practice, as it results in an ill-conditioned Jacobian.…”
Section: Introduction (mentioning)
Confidence: 99%
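
To make the padding construction concrete, here is a minimal NumPy sketch (illustrative names and a toy conditioner in place of learned networks; not the cited papers' code) of one affine coupling step on a zero-padded input, together with the log-determinant whose divergence is the ill-conditioning the quote describes:

    import numpy as np

    def affine_coupling(x, d):
        # The first d coordinates pass through unchanged and condition an
        # elementwise affine map of the remaining (padded) coordinates.
        # The scale/shift "networks" are fixed toy maps, not learned.
        x1, x2 = x[:d], x[d:]
        s = np.tanh(0.1 * x1.sum()) * np.ones_like(x2)  # log-scale
        t = 0.05 * x1.mean() * np.ones_like(x2)         # shift
        y2 = x2 * np.exp(s) + t
        log_det = s.sum()  # log |det J| of the coupling map
        return np.concatenate([x1, y2]), log_det

    d, d_pad = 4, 4
    x = np.concatenate([np.random.randn(d), np.zeros(d_pad)])  # zero-padded
    y, log_det = affine_coupling(x, d)
    # To map a full-dimensional base density onto inputs whose padded
    # coordinates are exactly zero, the flow must contract those
    # dimensions without bound (log_det -> -inf): this is the
    # ill-conditioned Jacobian referred to above.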