2016
DOI: 10.48550/arxiv.1605.08803
Preprint

Density estimation using Real NVP

Abstract: Unsupervised learning of probabilistic models is a central yet challenging problem in machine learning. Specifically, designing models with tractable learning, sampling, inference and evaluation is crucial in solving this task. We extend the space of such models using real-valued non-volume preserving (real NVP) transformations, a set of powerful, stably invertible, and learnable transformations, resulting in an unsupervised learning algorithm with exact log-likelihood computation, exact and efficient sampling…



Cited by 505 publications (947 citation statements)
References 35 publications
“…is the determinant of the neural network function's Jacobian matrix. Therefore, the probability of a random sample from q_θ(x) is easy to evaluate when the Jacobian matrix is computationally tractable, as is the case for commonly used normalizing flows, including NICE (Dinh et al. 2014), Real-NVP (Dinh et al. 2016), and Glow (Kingma & Dhariwal 2018). As described in Section 4.3, in this work we choose to use a Real-NVP generative network to parameterize q_θ(x).…”
Section: Normalizing Flow (mentioning)
confidence: 99%
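For a change of variables z = f(x) with base density p_Z, the flow's density is log q_θ(x) = log p_Z(f(x)) + log|det ∂f/∂x|, which is cheap to evaluate exactly when the Jacobian determinant is tractable. Below is a minimal PyTorch sketch of the Real-NVP affine coupling layer that achieves this; the class name, hidden width, and tanh-stabilized scale are illustrative assumptions, not the cited papers' implementations.

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """One Real-NVP affine coupling layer:
        y1 = x1
        y2 = x2 * exp(s(x1)) + t(x1)
    The Jacobian of this map is triangular, so log|det J| is just the
    sum of the scale outputs s(x1) -- no O(d^3) determinant needed."""

    def __init__(self, dim, hidden=64):
        super().__init__()
        self.d = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.d, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.d)),
        )

    def forward(self, x):
        x1, x2 = x[:, :self.d], x[:, self.d:]
        s, t = self.net(x1).chunk(2, dim=1)
        s = torch.tanh(s)                      # keep scales numerically stable
        y2 = x2 * torch.exp(s) + t
        return torch.cat([x1, y2], dim=1), s.sum(dim=1)
```

Stacking such layers, summing the returned log-determinants, and adding the base log-density at the output gives the exact log-likelihood the excerpt refers to.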
“…In both exoplanet astrometry and black hole feature extraction problems, we use a Real-NVP model with 32 affine coupling layers as the variational density function (Dinh et al. 2016). Each affine coupling layer is composed of a neural network with 3 fully connected layers, where the width (number of neurons) of each fully connected layer is 16 times the dimension of the inferred parameters.…”
Section: Implementation Details (mentioning)
confidence: 99%
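A hedged sketch of the stack this excerpt describes: 32 affine couplings, each conditioned by 3 fully connected layers of width 16 times the parameter dimension. Only the layer count and width come from the excerpt; the alternating-flip masking, tanh stabilization, and the dim=8 placeholder are assumptions.

```python
import torch
import torch.nn as nn

class Coupling(nn.Module):
    """Affine coupling whose conditioner has 3 fully connected layers of
    width 16 * dim, matching the quoted description; flipping alternates
    which half of the parameters is updated (an assumption)."""

    def __init__(self, dim, flip):
        super().__init__()
        self.d, self.flip = dim // 2, flip
        h = 16 * dim
        self.net = nn.Sequential(
            nn.Linear(self.d, h), nn.ReLU(),
            nn.Linear(h, h), nn.ReLU(),
            nn.Linear(h, 2 * (dim - self.d)),
        )

    def forward(self, x):
        if self.flip:
            x = torch.flip(x, dims=[1])
        x1, x2 = x[:, :self.d], x[:, self.d:]
        s, t = self.net(x1).chunk(2, dim=1)
        s = torch.tanh(s)
        y = torch.cat([x1, x2 * torch.exp(s) + t], dim=1)
        if self.flip:
            y = torch.flip(y, dims=[1])
        return y, s.sum(dim=1)

# 32 coupling layers as in the excerpt; dim=8 is a placeholder for the
# number of inferred parameters.
flow = nn.ModuleList(Coupling(dim=8, flip=(i % 2 == 1)) for i in range(32))
```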
“…For object detection (5.5), we use ResNet-50 [He et al., 2016]. Inspired by Dinh et al. [2016], we include both additive and multiplicative coupling in the residual blocks. At the same time, since we need neither bijectivity of the operator (and a proximal operator should not be bijective) nor access to the determinant of the Jacobian, we do not restrict ourselves to a map with triangular structure as in Dinh et al. [2016].…”
Section: B Network Architectures (mentioning)
confidence: 99%
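One possible reading of this design, sketched below: a residual block whose update combines a multiplicative (gating) coupling term and an additive (shift) coupling term applied to the whole input, with no masked partition and hence no bijectivity or tractable Jacobian. This is an illustrative guess at the described architecture, not the cited paper's code.

```python
import torch
import torch.nn as nn

class CouplingResidualBlock(nn.Module):
    """Residual block mixing multiplicative and additive coupling.
    Unlike Real-NVP, the scale and shift act on the full input rather
    than on a masked half, so the map is not constrained to be
    invertible and no log-determinant is tracked."""

    def __init__(self, channels):
        super().__init__()
        self.scale = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1),
        )
        self.shift = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, x):
        # multiplicative (gated) term plus additive term
        return x * torch.sigmoid(self.scale(x)) + self.shift(x)
```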
“…Each coupling layer g_j transforms a set of active gauge links using a set of gauge-invariant objects, such as plaquettes. The maps can be represented by tunable convolutional networks with a few hidden layers [19][20][21]. Keeping track of the phase-space deformation, this transformation can be computationally simplified using active, passive, and static masks, such that the Jacobian of the transformation becomes triangular.…”
(mentioning)
confidence: 99%
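A minimal sketch of such a masked coupling, assuming a scalar field on an L×L lattice rather than actual gauge links: frozen sites condition the update of active sites, so the Jacobian is triangular and its log-determinant reduces to a sum. The checkerboard mask and the small convolutional net are illustrative assumptions, not the cited models.

```python
import torch
import torch.nn as nn

class MaskedCoupling(nn.Module):
    """Coupling layer driven by a binary mask (1 = frozen, 0 = active).
    Active sites are updated conditioned only on frozen sites, so the
    Jacobian is triangular and log|det J| is a simple sum."""

    def __init__(self, mask, channels=8):
        super().__init__()
        self.register_buffer("mask", mask)         # shape (L, L)
        self.net = nn.Sequential(
            nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, 2, 3, padding=1),  # scale and shift fields
        )

    def forward(self, x):                           # x: (batch, 1, L, L)
        frozen = x * self.mask
        s, t = self.net(frozen).chunk(2, dim=1)
        active = 1.0 - self.mask
        s, t = torch.tanh(s) * active, t * active  # leave frozen sites untouched
        y = x * torch.exp(s) + t                    # identity on frozen sites
        return y, s.flatten(1).sum(dim=1)           # triangular Jacobian -> sum

# Example checkerboard mask on an 8x8 lattice (hypothetical sizes).
L = 8
mask = ((torch.arange(L)[:, None] + torch.arange(L)[None, :]) % 2).float()
layer = MaskedCoupling(mask)
```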