Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence 2019
DOI: 10.24963/ijcai.2019/305
Three-Player Wasserstein GAN via Amortised Duality

Abstract: We propose a new formulation for learning generative adversarial networks (GANs) using optimal transport cost (the general form of Wasserstein distance) as the objective criterion to measure the dissimilarity between the target distribution and the learned distribution. Our formulation is based on the general form of the Kantorovich duality, which is applicable to optimal transport with a wide range of cost functions that are not necessarily metric. To make optimising this duality form amenable to gradient-based method…
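As a minimal sketch of the optimal transport cost the abstract refers to: in one dimension, with equal-sized empirical samples, the optimal coupling for any convex cost c(x, y) = h(x − y) simply pairs sorted samples, so the OT cost can be computed directly. The distributions and sample sizes below are illustrative assumptions, not from the paper.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=1000)   # samples from a "target" distribution
y = rng.normal(0.5, 1.0, size=1000)   # samples from a "learned" distribution

# In 1D with equal sample counts, the optimal coupling for any convex cost
# pairs sorted samples (monotone rearrangement), so the OT cost is a mean
# over sorted pairs.
xs, ys = np.sort(x), np.sort(y)

w1 = np.mean(np.abs(xs - ys))         # OT cost with c = |x - y|   (Wasserstein-1)
w2_sq = np.mean((xs - ys) ** 2)       # OT cost with c = |x - y|^2 (squared W2)

# Cross-check W1 against SciPy's 1D primal computation.
assert np.isclose(w1, wasserstein_distance(x, y))
```

In higher dimensions no such sorting trick exists, which is why the paper turns to the Kantorovich duality and amortised optimisation instead.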

Cited by 3 publications (1 citation statement)
References 5 publications
“…The distance W_2^2 is particularly popular in a type of generative model, the Wasserstein Generative Adversarial Network (WGAN) [68]. The MaxiMin formulation is used in [69] to enrich a GAN: the Wasserstein distance is computed by solving the dual formulation of the problem with an ANN as the potential model. In order to obtain a mathematically explainable generative model, [70] incorporates Kantorovich potentials into the training of a GAN, where the discriminator error is the distance W_2^2.…”
Section: Uses of Neural Networks for Kantorovich Potentials
confidence: 99%
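The dual formulation the citing work refers to can be illustrated without a neural network: the Kantorovich–Rubinstein dual of W1 is the supremum of E_model[f] − E_target[f] over 1-Lipschitz potentials f, so any fixed 1-Lipschitz f yields a lower bound. The Gaussian choices below are illustrative assumptions; for two Gaussians differing only by a mean shift, f(t) = t happens to be an optimal potential.

```python
import numpy as np

rng = np.random.default_rng(1)
p = rng.normal(0.0, 1.0, size=5000)   # "target" samples
q = rng.normal(0.8, 1.0, size=5000)   # "model" samples

def dual_value(f, p, q):
    """Kantorovich-Rubinstein dual objective E_q[f] - E_p[f] for a potential f."""
    return f(q).mean() - f(p).mean()

# f(t) = t is 1-Lipschitz, so this is a lower bound on W1(p, q); for two
# Gaussians with equal variance it is in fact optimal, and W1 equals the
# mean shift (here 0.8 up to sampling noise).
lb = dual_value(lambda t: t, p, q)
```

A WGAN replaces the hand-picked potential with a neural network trained to maximise this same objective under a Lipschitz constraint.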