Dirichlet Variational Autoencoder
2020 · DOI: 10.1016/j.patcog.2020.107514

Cited by 69 publications (26 citation statements) · References 7 publications
“…In our application, it imposes a compact latent space, whose latent dimensions can be interpreted as mixture weights in a multinomial mixture model [49,50].…”
Section: Dirichlet-VAE · mentioning · confidence: 99%
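The quoted interpretation can be made concrete with a small sketch (all names and sizes here are illustrative, not taken from the cited papers): a simplex-valued latent code acts as the weight vector of a multinomial mixture, so decoding is a convex combination of component distributions.

```python
import numpy as np

rng = np.random.default_rng(0)

K, V = 4, 10                               # mixture components, vocabulary size (illustrative)
beta = rng.dirichlet(np.ones(V), size=K)   # each row: one multinomial component

z = rng.dirichlet(np.ones(K))              # simplex-valued latent code = mixture weights
p_x = z @ beta                             # induced distribution over the vocabulary

assert np.isclose(p_x.sum(), 1.0)          # a convex combination of distributions is a distribution
```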
“…We then study a generalisation, in which the Gaussian prior is replaced by a Gaussian-Mixture prior (GMVAE) [42-48] in Section 3. In Section 4 we introduce the Dirichlet-VAE (DVAE), which uses a compact latent space with a Dirichlet prior [49,50]. Through a specific choice of decoder architecture we can interpret the decoder weights as the parameters of the mixture distributions in the probabilistic model, and can visualise these to directly interpret what the neural network is learning.…”
Section: Introduction · mentioning · confidence: 99%
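A minimal sketch of such a decoder, assuming (not quoting) an architecture consistent with the description above: softmaxing each row of a single linear layer's weight matrix yields valid multinomial components, so the learned weights can be read off and visualised directly.

```python
import torch
import torch.nn.functional as F

class LinearMixtureDecoder(torch.nn.Module):
    """Decode a simplex-valued latent z as a mixture of K learned multinomials."""

    def __init__(self, n_components: int, vocab_size: int):
        super().__init__()
        self.weight = torch.nn.Parameter(torch.randn(n_components, vocab_size))

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        beta = F.softmax(self.weight, dim=-1)  # rows are the mixture distributions
        return z @ beta                        # z supplies the mixture weights

decoder = LinearMixtureDecoder(n_components=4, vocab_size=10)
z = torch.distributions.Dirichlet(torch.ones(4)).sample()
print(decoder(z).sum())                        # ~1: the output is itself a distribution
```

Plotting `F.softmax(decoder.weight, dim=-1)` row by row then gives a direct read-out of what each latent dimension has learned.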
“…Although the latent variables follow a Gaussian distribution in the standard VAE, models have been proposed that assume other distributions (Jang et al., 2017; Srivastava and Sutton, 2017; Joo et al., 2019). These models make it possible to sample from distributions other than a Gaussian by formulating the sampling procedure in a differentiable form.…”
Section: Related Work · mentioning · confidence: 99%
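The "differentiable form" the quote refers to is the reparameterisation trick; a minimal Gaussian example (standard technique, not code from the cited works): write the sample as a deterministic function of the parameters plus parameter-free noise, so gradients flow through the sampling step.

```python
import torch

mu = torch.zeros(5, requires_grad=True)
log_sigma = torch.zeros(5, requires_grad=True)

eps = torch.randn(5)                 # parameter-free noise
z = mu + log_sigma.exp() * eps       # sample, differentiable in mu and log_sigma

z.sum().backward()
print(mu.grad, log_sigma.grad)       # well-defined gradients through the sampler
```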
“…Srivastava and Sutton used the Laplace approximation, whereby the parameters of a Dirichlet distribution are approximated by those of a Gaussian, making it possible to sample from an approximate Dirichlet distribution by adding a softmax layer to the standard VAE (Srivastava and Sutton, 2017). Joo et al. estimated the parameters of a multivariate Gamma distribution using an approximation of the inverse Gamma cumulative distribution function to enable sampling from a Dirichlet distribution (Joo et al., 2019). In this study, we use the Gumbel-Softmax distribution to obtain samples from a multinomial distribution.…”
Section: Related Work · mentioning · confidence: 99%
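A sketch of the Gumbel-Softmax sampling the quoted study uses (standard formulation from Jang et al., 2017; the helper name is ours): perturb the category logits with Gumbel noise and apply a temperature-controlled softmax, giving a differentiable relaxation of a one-hot sample.

```python
import torch
import torch.nn.functional as F

def gumbel_softmax_sample(logits: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    u = torch.rand_like(logits).clamp_min(1e-10)   # avoid log(0)
    gumbel = -torch.log(-torch.log(u))             # Gumbel(0, 1) noise
    return F.softmax((logits + gumbel) / tau, dim=-1)

logits = torch.tensor([1.0, 0.5, -0.5], requires_grad=True)
sample = gumbel_softmax_sample(logits)
print(sample)              # near one-hot for small tau, yet differentiable
sample[0].backward()       # gradients flow back to the logits
print(logits.grad)
```

PyTorch also ships this relaxation as `torch.nn.functional.gumbel_softmax`.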
“…Calculating the gradient of the sampling process from the Dirichlet distribution is difficult. Researchers have proposed approximation methods [38,39,40,41] to apply the Dirichlet distribution in neural topic models. We follow the rejection sampling method [42] in this work.…”
Section: Overview Of Our Framework · mentioning · confidence: 99%
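The standard route is sketched below; PyTorch's reparameterised Gamma sampler stands in here for the rejection sampler of [42], whose gradients it provides implicitly. Drawing each coordinate from a Gamma(alpha_k, 1) and normalising onto the simplex yields a Dirichlet(alpha) sample that is differentiable in alpha.

```python
import torch

alpha = torch.tensor([0.5, 1.0, 2.0], requires_grad=True)

# Gamma(alpha_k, 1) draws with reparameterised (pathwise) gradients
gammas = torch.distributions.Gamma(alpha, torch.ones_like(alpha)).rsample()
z = gammas / gammas.sum()        # Dirichlet(alpha)-distributed, lies on the simplex

z[0].backward()
print(alpha.grad)                # gradient flows through the sampling step
```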