2020
DOI: 10.48550/arxiv.2011.07466
Preprint

Continuous Conditional Generative Adversarial Networks: Novel Empirical Losses and Label Input Mechanisms

Abstract: This work proposes the concept of continuous conditional generative adversarial network (CcGAN), the first generative model for image generation conditional on continuous, scalar conditions (termed regression labels). Existing conditional GANs (cGANs) are mainly designed for categorical conditions (e.g., class labels); conditioning on regression labels is mathematically distinct and raises two fundamental problems: (P1) Since there may be very few (even zero) real images for some regression labels, minimizing …

Cited by 11 publications (45 citation statements)
References 26 publications
“…With three examples presented, we observe that the ILI model performs better than the NLI in all cases except one (Table 1). This finding is in good agreement with the classification problems in Ding et al (2020), who speculate that the ILI overcomes the label inconsistency of classification problems while the NLI does not.…”
Section: Prediction Accuracy and Geophysical Applications (supporting)
confidence: 84%
“…This finding is in good agreement with the classification problems in Ding et al (2020), who speculate that the ILI overcomes the label inconsistency of classification problems while the NLI does not. Our reasoning for the outperformance of the ILI stems from continuous conditional batch normalization, which carries temporal information more consistently than the element-wise addition in the NLI model.…”
Section: Prediction Accuracy and Geophysical Applications (supporting)
confidence: 84%
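The statement above attributes the ILI's advantage to label-conditioned ("continuous conditional") batch normalization, as opposed to the NLI's element-wise addition of a label embedding. A minimal NumPy sketch of that contrast follows; all names (`nli`, `ili`, `embed_label`), shapes, and the affine maps from the label are illustrative assumptions, not the paper's exact architecture:

```python
# Sketch of two ways to inject a scalar regression label y into hidden features.
# NLI (naive label input): add the label embedding element-wise to the features.
# ILI (improved label input): let the label embedding produce per-channel scale
# and shift for batch normalization ("continuous conditional" batch norm).
import numpy as np

rng = np.random.default_rng(0)

def embed_label(y, dim=8):
    """Toy label embedding: a fixed deterministic projection of scalar y."""
    w = np.linspace(0.1, 1.0, dim)
    return y * w                              # shape (dim,)

def nli(features, y):
    """Naive label input: element-wise addition of the label embedding."""
    return features + embed_label(y, features.shape[-1])

def ili(features, y, eps=1e-5):
    """Improved label input: label-conditioned batch normalization.
    gamma/beta are simple (assumed) affine functions of the label embedding."""
    e = embed_label(y, features.shape[-1])
    gamma, beta = 1.0 + 0.1 * e, 0.1 * e
    mu = features.mean(axis=0)                # per-channel batch statistics
    var = features.var(axis=0)
    normed = (features - mu) / np.sqrt(var + eps)
    return gamma * normed + beta

batch = rng.normal(size=(4, 8))               # a batch of 4 hidden feature rows
print(nli(batch, y=0.7).shape)                # (4, 8)
print(ili(batch, y=0.7).shape)                # (4, 8)
```

Because `ili` normalizes before applying the label-dependent affine transform, the label's influence is tied to the feature statistics of the whole batch rather than added pointwise, which is one plausible reading of why the citing authors find it carries conditioning information more consistently.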