2021
DOI: 10.48550/arxiv.2105.13010
Preprint
An error analysis of generative adversarial networks for learning distributions

Abstract: This paper studies how well generative adversarial networks (GANs) learn probability distributions from finite samples. Our main results estimate the convergence rates of GANs under a collection of integral probability metrics defined through Hölder classes, including the Wasserstein distance as a special case. We also show that GANs are able to adaptively learn data distributions with low-dimensional structure or Hölder densities, when the network architectures are chosen properly. In particular, for dis…
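For reference, the integral probability metrics (IPMs) the abstract refers to take the following standard form (notation is assumed here, not quoted from the paper): for a function class $\mathcal{F}$, the IPM between distributions $\mu$ and $\nu$ is

```latex
d_{\mathcal{F}}(\mu, \nu) = \sup_{f \in \mathcal{F}} \left| \mathbb{E}_{x \sim \mu}[f(x)] - \mathbb{E}_{x \sim \nu}[f(x)] \right|
```

Taking $\mathcal{F}$ to be the class of 1-Lipschitz functions recovers the Wasserstein-1 distance, which is the special case the abstract mentions; the paper's results use Hölder classes, which generalize this choice.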

Cited by 2 publications
(6 citation statements)
References 38 publications
“…Suppose g_{n,m} ∈ G is a solution with optimization error ε_opt. Using the argument in Huang et al. (2021), one can show that g_{n,m} achieves the same rate as g_n in Theorem 4.6, if m is sufficiently large.…”
Section: If We Choose
confidence: 92%
“…Lemma 4.5 (Huang et al. (2021)). Assume that F is symmetric (f ∈ F implies −f ∈ F), and that µ and g_#ν are supported on [0, 1]^d for all g ∈ G. Then, for any g_n ∈ G satisfying (4.8),…”
Section: Generative Adversarial Network
confidence: 94%