2021
DOI: 10.48550/arxiv.2106.05319
Preprint

Stein Latent Optimization for Generative Adversarial Networks

Abstract: Generative adversarial networks (GANs) with clustered latent spaces can perform conditional generation in a completely unsupervised manner. However, the salient attributes of unlabeled data in the real-world are mostly imbalanced. Existing unsupervised conditional GANs cannot properly cluster the attributes in their latent spaces because they assume uniform distributions of the attributes. To address this problem, we theoretically derive Stein latent optimization that provides reparameterizable gradient estima…
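To illustrate the kind of latent space the abstract describes, here is a minimal sketch (not the paper's implementation) of sampling from a mixture-of-Gaussians latent space with imbalanced cluster probabilities, using the reparameterization trick so gradients could flow to the mixture parameters. The cluster count, dimensions, probabilities, and parameter values are all illustrative placeholders:

```python
# Hedged sketch: a clustered GAN latent space as a Gaussian mixture.
# All parameter values below are illustrative assumptions, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

K, D = 3, 4                      # number of clusters, latent dimension
pi = np.array([0.6, 0.3, 0.1])   # imbalanced cluster probabilities
mu = rng.normal(size=(K, D))     # per-cluster means (learnable in practice)
sigma = np.full((K, D), 0.5)     # per-cluster scales (learnable in practice)

def sample_latent(n):
    """Sample n latent vectors: pick a cluster, then reparameterize."""
    c = rng.choice(K, size=n, p=pi)   # cluster assignments ~ pi
    eps = rng.normal(size=(n, D))     # noise independent of the parameters
    z = mu[c] + sigma[c] * eps        # reparameterization: z = mu_c + sigma_c * eps
    return z, c

z, c = sample_latent(1000)
print(z.shape)  # (1000, 4)
```

Because `eps` is drawn independently of `mu` and `sigma`, the sample `z` is a differentiable function of those parameters, which is the property a reparameterizable gradient estimator relies on; conditional generation then amounts to fixing the cluster index `c`.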

Cited by 1 publication (3 citation statements)
References 22 publications
“…Self-Conditioned GAN (Liu et al, 2020) uses clustering of discriminative features as labels to train. SLOGAN (Hwang et al, 2021) proposes a new conditional contrastive loss (U2C) to learn latent distribution of the data. Note that compared to our work, ClusterGAN and SLOGAN introduce an additional encoder that leads to increased computational complexity.…”
Section: Related Work
“…By sampling from these compact models, we can conditionally regenerate meaningful samples from computed clusters. This is known as unsupervised conditional image generation (Hwang et al, 2021).…”
Section: Sample-wise Constraints For Unsupervised Transcription