2018
DOI: 10.48550/arxiv.1806.09060
Preprint

Disentangled VAE Representations for Multi-Aspect and Missing Data

Abstract: Many problems in machine learning and related application areas are fundamentally variants of conditional modeling and sampling across multi-aspect data, either multi-view, multi-modal, or simply multi-group. For example, sampling from the distribution of English sentences conditioned on a given French sentence or sampling audio waveforms conditioned on a given piece of text. Central to many of these problems is the issue of missing data: we can observe many English, French, or German sentences individually bu…

Cited by 4 publications (5 citation statements) | References 9 publications
“…Another alternative for imputing missing data is to rely on unsupervised methods. These methods are based on variational autoencoders (VAEs) [60], [61] or generative adversarial networks (GANs) [62], [63]. In [64], a generic framework for missing data imputation on time series is developed by combining concepts from VAEs [65], Cauchy kernels [66], Gaussian Processes [67], structured variational distributions with efficient inference [68] and a particular Evidence Lower Bound (ELBO) for missing data [61].…”
Section: Reconstruction Methods, A. Scope and Background (mentioning; confidence: 99%)
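The "ELBO for missing data" referenced in this statement is typically an evidence lower bound whose reconstruction term is restricted to the observed entries, with the observation mask fed to the encoder alongside the zero-filled data. Below is a minimal sketch of that idea in PyTorch; it is not the cited papers' exact model, and all layer sizes, names, and the unit-variance Gaussian decoder are illustrative assumptions.

```python
import math
import torch
import torch.nn as nn

class MaskedVAE(nn.Module):
    """VAE whose reconstruction term only covers observed entries (illustrative sketch)."""
    def __init__(self, x_dim=32, z_dim=8, h_dim=64):
        super().__init__()
        # The encoder sees the zero-filled data concatenated with the mask.
        self.enc = nn.Sequential(nn.Linear(2 * x_dim, h_dim), nn.ReLU())
        self.mu = nn.Linear(h_dim, z_dim)
        self.logvar = nn.Linear(h_dim, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))

    def elbo(self, x, mask):
        # mask: 1.0 where x is observed, 0.0 where it is missing.
        h = self.enc(torch.cat([x * mask, mask], dim=-1))
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterisation
        x_hat = self.dec(z)
        # Unit-variance Gaussian log-likelihood, restricted to observed entries.
        log_px = (-0.5 * ((x - x_hat) ** 2 + math.log(2 * math.pi)) * mask).sum(-1)
        kl = -0.5 * (1 + logvar - mu ** 2 - logvar.exp()).sum(-1)
        return log_px - kl                                      # per-sample ELBO

x = torch.randn(16, 32)
mask = (torch.rand(16, 32) > 0.3).float()                       # roughly 30% missing
loss = -MaskedVAE().elbo(x, mask).mean()
loss.backward()
```

Training then proceeds by minimising the negative ELBO as usual; imputations for the missing positions can be read off from the decoder output.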
“…The researchers in [62]-[64] propose the use of non-linear dimensionality reduction to handle missing values in time series using variational autoencoders (VAEs). The GP-VAE proposed by [36] is especially focused on multivariate time series (MTS).…”
Section: VAE-GAN Models (mentioning; confidence: 99%)
“…Variational auto-encoders (VAEs) have been proposed for several problems within this definition of unsupervised reconstruction [10,5,7]. These methods lead to good single estimates of the underlying targets.…”
Section: VAEs and the Posterior Collapse Problem (mentioning; confidence: 99%)
“…Finally, Bayesian LVM methods have been used on other unsupervised tasks that can be cast as special cases of data recovery problems. Amongst these, we find multi-view generation [30,7], where the target clean data includes all views for each sample, but the observed data only presents subsets. Blind source separation can also be cast as a recovery problem and has been approached with GANs and VAEs [31,32].…”
Section: Unsupervised Bayesian Reconstruction (mentioning; confidence: 99%)