Proceedings of the 16th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications 2021
DOI: 10.5220/0010167902110218
Combating Mode Collapse in GAN Training: An Empirical Analysis using Hessian Eigenvalues

Cited by 22 publications (13 citation statements) | References: 0 publications
“…The training of a GAN model is not a simple task, owing to the difficulty of finding an optimal point on which both G and D can converge, which many studies describe as equivalent to the search for a Nash equilibrium in a high-dimensional, highly non-convex optimization space [25]. Both G and D need to be closely monitored and trained in relation to each other.…”
Section: Methods
confidence: 99%
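Read literally, the statement above describes the standard two-player, alternating-update view of GAN training: D and G are optimized in turn, each against the other's current state. A minimal sketch of that loop in PyTorch follows; the network sizes, learning rates, latent dimension, and the placeholder `sample_real` function are illustrative assumptions, not the configuration used in the cited work.

```python
# Minimal alternating GAN training sketch (PyTorch).
# All architectural choices below (layer sizes, latent_dim, learning rates,
# toy 2-D data) are illustrative assumptions.
import torch
import torch.nn as nn

latent_dim, data_dim = 16, 2  # assumed: 2-D toy data such as a Gaussian ring

G = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, data_dim))
D = nn.Sequential(nn.Linear(data_dim, 64), nn.ReLU(), nn.Linear(64, 1))

opt_G = torch.optim.Adam(G.parameters(), lr=1e-4)
opt_D = torch.optim.Adam(D.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

def sample_real(batch):
    # placeholder real-data sampler; swap in the actual dataset
    return torch.randn(batch, data_dim)

for step in range(1000):
    real = sample_real(128)
    z = torch.randn(128, latent_dim)
    fake = G(z)

    # Discriminator step: push real toward label 1, generated toward label 0.
    opt_D.zero_grad()
    loss_D = bce(D(real), torch.ones(128, 1)) + bce(D(fake.detach()), torch.zeros(128, 1))
    loss_D.backward()
    opt_D.step()

    # Generator step: try to make the updated discriminator label fakes as real.
    opt_G.zero_grad()
    loss_G = bce(D(fake), torch.ones(128, 1))
    loss_G.backward()
    opt_G.step()
```

In practice the two losses are monitored jointly, since a discriminator that wins too quickly leaves the generator with little useful gradient signal, which is part of why the two networks must be "trained in relation to each other."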
“…The authors in [ 24 ] evaluated and compared the performance of WGAN, AdaGAN, VEEGAN, and Unrolled GAN concerning mode collapse limitations. The experiments were performed over both synthetic and real datasets using the same architectures and training procedures for the GAN algorithms.…”
Section: GAN Framework and Problem Statement
confidence: 99%
“…For the coverage metric (which also measures the distance between the real and generated sample distributions), VEEGAN, WGAN, and AdaGAN recorded the highest coverage on the 2D ring dataset. The WGAN generated low-quality samples [24].…”
Section: GAN Framework and Problem Statement
confidence: 99%
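Coverage and quality numbers like those quoted above are typically computed on synthetic mixtures such as a 2D ring of Gaussians: a generated sample counts as high-quality if it falls close to a mode, and coverage counts how many modes receive at least one such sample. The sketch below shows that bookkeeping; the 8-mode ring, the 3-sigma threshold, and the randomly drawn "generated" samples are assumptions for illustration, not the exact protocol of [24].

```python
# Sketch of mode-coverage / sample-quality bookkeeping on a 2D ring of Gaussians.
# The 8-mode ring, sigma, and the 3*sigma "high quality" rule are assumed conventions.
import numpy as np

def ring_modes(n_modes=8, radius=2.0):
    # mode centres evenly spaced on a circle
    angles = 2 * np.pi * np.arange(n_modes) / n_modes
    return np.stack([radius * np.cos(angles), radius * np.sin(angles)], axis=1)

def coverage_and_quality(samples, modes, sigma=0.02):
    # distance from every sample to every mode centre
    d = np.linalg.norm(samples[:, None, :] - modes[None, :, :], axis=-1)
    nearest = d.argmin(axis=1)
    good = d.min(axis=1) < 3 * sigma          # "high quality": within 3 sigma of a mode
    covered = np.unique(nearest[good]).size   # modes hit by at least one good sample
    return covered, good.mean()

modes = ring_modes()
# stand-in for generator output: noisy samples around randomly chosen modes
fake = np.random.randn(10000, 2) * 0.02 + modes[np.random.randint(8, size=10000)]
n_covered, quality = coverage_and_quality(fake, modes)
print(f"modes covered: {n_covered}/8, high-quality fraction: {quality:.2f}")
```

A mode-collapsed generator shows up here as a low covered-mode count even when the high-quality fraction is large, which is why the two statistics are reported together.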