2019
DOI: 10.48550/arxiv.1910.14090
Preprint

Conditional Density Estimation, Latent Variable Discovery and Optimal Transport

Abstract: A framework is proposed that addresses both conditional density estimation and latent variable discovery. The objective function maximizes explanation of variability in the data, achieved through the optimal transport barycenter generalized to a collection of conditional distributions indexed by a covariate, either given or latent, in any suitable space. Theoretical results establish the existence of barycenters, a minimax formulation of optimal transport maps, and a general characterization of variability via …
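To make the objective concrete, here is a minimal sketch in the spirit of the classical Wasserstein barycenter, generalized to conditional distributions as the abstract describes; the notation $\rho(\cdot\mid z)$ for the conditionals, the cost $c$, and the expectation over the covariate $z$ are assumptions, not the paper's exact formulation:

$$
\nu^{*} \in \arg\min_{\nu}\ \mathbb{E}_{z}\Big[\, \min_{T_z:\ T_{z\#}\,\rho(\cdot\mid z)\,=\,\nu}\ \int c\big(x, T_z(x)\big)\, \rho(dx \mid z) \Big],
$$

where $T_{z\#}\rho(\cdot\mid z)=\nu$ is the push-forward constraint; the transport cost removed by mapping each conditional distribution to the common barycenter $\nu$ measures the variability explained by the covariate $z$.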

Cited by 3 publications (3 citation statements). References 41 publications.

“…The methodology for the solution of the data-driven distributional barycenter problem proposed in this article can be used with general cost functions. It improves significantly over previous approaches to the barycenter problem based on adversarial games ([7,28,34]). The latter have two players: one that proposes cost-minimizing maps through time-evolving flows, and another that builds test functions to enforce the push-forward condition.…”
Section: Introduction (mentioning)
confidence: 72%
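For context, a generic form of the adversarial game described in this excerpt has a transport player $T$ and a test-function player $f$; this is a sketch, not the exact objective of [7,28,34], and the penalty weight $\lambda$ and the function classes are assumptions:

$$
\min_{T}\ \max_{f}\ \ \mathbb{E}_{x\sim\mu}\big[c\big(x, T(x)\big)\big] \;+\; \lambda\Big(\mathbb{E}_{x\sim\mu}\big[f\big(T(x)\big)\big] \;-\; \mathbb{E}_{y\sim\nu}\big[f(y)\big]\Big).
$$

The first player proposes cost-minimizing maps $T$; the second builds test functions $f$ whose expectations differ under $T_{\#}\mu$ and $\nu$, so over a sufficiently rich (symmetric) class of $f$ the inner maximum is zero only when the push-forward condition $T_{\#}\mu = \nu$ holds.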
“…Optimal transport and the related Wasserstein barycenter problem have undergone rapid development during the last ten years, with a particular focus on applications to the analysis of data and machine learning [11], ranging from gene expression [24] to economics [9]. Procedures based on optimal transport have been used for density and conditional density estimation [30,28], data augmentation [19], image classification [12,31,35], computer vision [25,32,2,20], factor discovery [34] and data imputation [26].…”
Section: Introduction (mentioning)
confidence: 99%
“…Since the optimal discriminator gives the Jensen-Shannon divergence up to a constant, this choice is valid in (2). Indeed, as shown in [32], a variety of divergences may be tightly lower-bounded by similar min-max functionals, a more general feature of dual formulations of measure transport problems [47]. We approximate the components K and F of T with neural networks and replace the strict monotonicity constraint on T with an average monotonicity constraint.…”
Section: The Training and Sampling Procedures (mentioning)
confidence: 99%
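The identity invoked in this excerpt is the standard GAN discriminator result, stated here for general densities $p$ and $q$ as background rather than as this paper's specific parameterization of K and F:

$$
\max_{D:\,\mathcal{X}\to(0,1)}\ \Big\{ \mathbb{E}_{x\sim p}\big[\log D(x)\big] \;+\; \mathbb{E}_{x\sim q}\big[\log\big(1 - D(x)\big)\big] \Big\} \;=\; 2\,\mathrm{JS}(p \,\|\, q) \;-\; \log 4,
$$

attained at $D^{*}(x) = p(x)/\big(p(x)+q(x)\big)$, which is why the optimal discriminator gives the Jensen-Shannon divergence up to an additive constant.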