SIGGRAPH Asia 2018 Technical Briefs 2018
DOI: 10.1145/3283254.3283282

On the convergence and mode collapse of GAN

Cited by 34 publications (14 citation statements)
References 1 publication
“…WGAN and WGAN-GP. A common problem in the training of GANs is the so-called mode collapse [15]. When it happens, the generator begins to synthesize the same images regardless of the input data.…”
Section: GAN Architectures Overview
confidence: 99%
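The collapse symptom described in the statement above — the generator synthesizing the same output regardless of its input — can be illustrated with a small diversity check. This is a hypothetical sketch, not taken from the cited papers; `sample_diversity` and the toy generators are invented for illustration:

```python
import numpy as np

def sample_diversity(generate, n=64, latent_dim=8, seed=0):
    """Average pairwise distance between generated samples.

    A value near zero despite varied latent inputs is a symptom of
    mode collapse: the generator is ignoring its input.
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_normal((n, latent_dim))
    x = np.stack([generate(zi) for zi in z])
    diffs = x[:, None, :] - x[None, :, :]        # all pairwise differences
    return float(np.mean(np.linalg.norm(diffs, axis=-1)))

# Toy generators (illustrative only):
healthy = lambda z: z * 2.0               # output varies with the input
collapsed = lambda z: np.ones_like(z)     # same output for every z
```

For the `collapsed` generator the diversity is exactly zero, while the `healthy` one yields a large value; in practice such a statistic is only a coarse diagnostic, since a generator can also collapse onto a few distinct modes rather than a single point.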
“…By using this strategy, their method can compute the KL and reverse KL divergence simultaneously, which in turn increases the variety of samples. Based on this idea, Zhang et al. [31] propose a D2GAN variation with two customized discriminators. Specifically, one discriminator consists of residual blocks forming a deep network, aiming to increase the variety of generated samples.…”
Section: Mode Collapse in GANs
confidence: 99%
“…otherwise, it will be difficult to converge [47]. While training the discriminator, the generator must be held constant, because the discriminator needs to learn to differentiate between real data and generated (fake) data.…”
Section: Training the GAN
confidence: 99%
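The alternating scheme described above — freezing the generator while the discriminator trains, then holding the discriminator fixed for the generator update — can be sketched on a toy 1-D problem. This is a minimal illustrative sketch, not the cited papers' method: the scalar-shift generator, logistic discriminator, and learning rate are all invented for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: real data ~ N(3, 1); the "generator" is a single
# learned shift applied to standard-normal noise.
g_shift = 0.0            # generator parameter
d_w, d_b = 0.1, 0.0      # logistic-discriminator parameters
lr = 0.05

def d_prob(x):
    """Discriminator's estimated probability that x is real."""
    return 1.0 / (1.0 + np.exp(-(d_w * x + d_b)))

for step in range(500):
    real = rng.normal(3.0, 1.0, size=32)
    noise = rng.normal(0.0, 1.0, size=32)
    fake = noise + g_shift          # generator output (held fixed below)

    # --- Discriminator step: g_shift is NOT updated here ---
    p_real, p_fake = d_prob(real), d_prob(fake)
    err = np.concatenate([p_real - 1.0, p_fake])   # dBCE/dlogit per sample
    xs = np.concatenate([real, fake])
    d_w -= lr * np.mean(err * xs)
    d_b -= lr * np.mean(err)

    # --- Generator step: discriminator parameters held fixed ---
    p_fake = d_prob(noise + g_shift)
    # non-saturating generator loss: maximize log D(G(z))
    g_shift -= lr * np.mean((p_fake - 1.0) * d_w)
```

After training, `g_shift` drifts toward 3, the mean of the real data, illustrating convergence of the alternating updates; in deep GANs the same schedule is typically implemented by disabling gradient flow through the frozen network during each half-step.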
“…Previous research [40], [44], [47] suggests that training a traditional GAN is hard. The most common problems are as follows.…”
Section: Generating Data Using GAN
confidence: 99%