2020
DOI: 10.48550/arxiv.2006.02682
Preprint

Some Theoretical Insights into Wasserstein GANs

Cited by 3 publications (4 citation statements). References 0 publications.
“…To the best of our knowledge, this is the first result assessing the influence of the noise and of the contamination on the error of generative modeling. This constitutes an appealing complement to the recently obtained statistical guarantees (Biau et al, 2020b;Luise et al, 2020).…”
Section: Discussion
confidence: 88%
“…Without any smoothness assumptions, Biau et al (2020a) provide large sample properties of the estimated distribution, assuming that all the densities induced by the class of generators are dominated by a fixed known measure on a Borel subset of R^D. When the admissible discriminators are neural networks with a given architecture, Biau et al (2020b) obtain the parametric rate n^{-1/2}. To our knowledge, Luise et al (2020) is the only work that establishes statistical guarantees under the assumption that the data-generating process is a smooth transformation of a low-dimensional latent distribution.…”
Section: Related Work (And Contributions)
confidence: 99%
“…The main strand of research on GANs deals with empirical insights and basic mathematical properties. Recently, researchers have started to analyze the GAN problem from the statistical perspective [Biau et al, 2020b,a, Liang, 2018, Singh et al, 2018, Luise et al, 2020, Uppal et al, 2019] as well as from optimization and algorithmic viewpoints [Liang and Stokes, 2019, Kodali et al, 2017, Pfau and Vinyals, 2016, Nie and Patel, 2020, Nagarajan and Kolter, 2017, Genevay et al, 2018].…”
Section: Introduction
confidence: 99%
“…Recently published theoretical work on generative adversarial learning includes theory on Wasserstein methods [4], as well as theory on domain shifts quantified by means of an adversarial loss that reduces the Jensen-Shannon divergence [26].…”
Section: Introduction
confidence: 99%