“…We build on variational autoencoders (VAEs) [29,35], a latent variable modeling framework that has been shown to learn useful latent representations (also called encodings or embeddings) [20,24,57,14,53,8,45,2] and to capture the generative process [36,53,46,54]. VAEs [29,35] introduce a latent variable z, an encoder q_φ, a decoder p_θ, and a prior distribution p over z. Here φ and θ are the parameters of q and p_θ, respectively, typically instantiated as neural networks.…”
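The encoder/decoder setup described above can be sketched in a few lines. The following is a minimal illustration (not the paper's implementation), assuming numpy is available: linear maps stand in for the neural networks parameterizing q_φ and p_θ, q_φ(z|x) is a diagonal Gaussian, the prior p(z) is a standard normal, and all weight names (`W_mu`, `W_logvar`, `W_dec`) are hypothetical.

```python
import numpy as np

def encoder(x, W_mu, W_logvar):
    # q_phi(z|x): map the input x to the mean and log-variance
    # of a diagonal Gaussian over the latent variable z.
    return x @ W_mu, x @ W_logvar

def reparameterize(mu, logvar, rng):
    # Sample z ~ q_phi(z|x) via the reparameterization trick,
    # so gradients can flow through the sampling step.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * logvar) * eps

def decoder(z, W_dec):
    # p_theta(x|z): map a latent sample back to observation space
    # (here just the mean of a linear-Gaussian likelihood).
    return z @ W_dec

def kl_to_standard_normal(mu, logvar):
    # Closed-form KL(q_phi(z|x) || N(0, I)) for diagonal Gaussians;
    # this is the regularization term in the VAE objective.
    return 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar)
```

In a full model the two linear maps would be replaced by deep networks, and training would maximize the evidence lower bound: reconstruction log-likelihood under p_θ minus the KL term above.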