Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence 2019
DOI: 10.24963/ijcai.2019/391
Learning Generative Adversarial Networks from Multiple Data Sources

Abstract: Generative Adversarial Networks (GANs) are a powerful class of deep generative models. In this paper, we extend GAN to the problem of generating data that are not only close to a primary data source but also required to be different from auxiliary data sources. For this problem, we enrich both GANs' formulations and applications by introducing pushing forces that thrust generated samples away from given auxiliary data sources. We term our method Push-and-Pull GAN (P2GAN). We conduct extensive experiments to de…
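The push-and-pull idea in the abstract can be sketched as a generator objective with a standard "pull" term toward the primary source plus one repulsion term per auxiliary source. This is an illustrative reconstruction, not the paper's actual formulation; the function name, loss form, and `push_weight` parameter are all assumptions.

```python
import numpy as np

def p2gan_generator_loss(d_primary, d_aux_list, push_weight=1.0):
    """Hypothetical push-and-pull generator loss (illustrative sketch only).

    d_primary  : discriminator probabilities that generated samples match
                 the primary source (the generator wants these high).
    d_aux_list : one array per auxiliary source, giving probabilities that
                 generated samples resemble that source (wanted low).
    """
    eps = 1e-8
    # Pull: standard non-saturating GAN term toward the primary data.
    pull = -np.mean(np.log(d_primary + eps))
    # Push: penalize resemblance to every auxiliary source.
    push = sum(-np.mean(np.log(1.0 - d + eps)) for d in d_aux_list)
    return pull + push_weight * push

loss = p2gan_generator_loss(np.array([0.9, 0.8]), [np.array([0.1, 0.2])])
```

Under this sketch, samples that fool the primary discriminator but still resemble an auxiliary source are penalized by the push terms, which is the "thrust away" behavior the abstract describes.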

Cited by 8 publications (3 citation statements). References 5 publications.
“…MEGAN [31] adopts a gating network that produces a one-hot vector to select the generator creating the best example. P2GAN [32] sequentially adds a new generator, which is different from the existing generators, to cover the missing modes of the real data.…”
Section: GAN with Multiple Generators
confidence: 99%
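The one-hot gating described in the excerpt above can be illustrated in a few lines: a gating network scores each generator for a latent code, and a hard one-hot selection routes the code to a single generator. The linear gating and generator maps below are stand-ins for illustration, not MEGAN's actual networks.

```python
import numpy as np

rng = np.random.default_rng(0)
n_generators, latent_dim, sample_dim = 3, 4, 2

# Stand-in "generators": fixed linear maps from latent code to sample space.
gen_weights = [rng.standard_normal((latent_dim, sample_dim))
               for _ in range(n_generators)]
# Stand-in gating network: one linear score per generator.
gate_weights = rng.standard_normal((latent_dim, n_generators))

def generate(z):
    scores = z @ gate_weights
    one_hot = np.eye(n_generators)[np.argmax(scores)]  # hard one-hot selection
    sample = z @ gen_weights[int(np.argmax(one_hot))]  # only the chosen generator runs
    return sample, one_hot

sample, one_hot = generate(rng.standard_normal(latent_dim))
```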
“…Unlike DAgger (Ross, Gordon, and Bagnell 2011), can simply ask the expert for such actions. GAIL suffers from the problems of mode collapse and low sample efficiency in terms of environment interaction (Le et al. 2019). The weakness of mode collapse is inherited from GANs, and several works have built on GAIL to overcome this problem (Li, Song, and Ermon 2017; Fei et al. 2020).…”
Section: Introduction
confidence: 99%
“…GANs are known to be affected by the mode collapsing problem [5,7,10,17]. In particular, the study in [17] recently examined mode collapse and further classified it into the missing mode problem, i.e., the generated samples miss some modes in the true data, and the boundary distortion problem, i.e., the generated samples can only partly recover some modes in the true data.…”
Section: Introduction
confidence: 99%
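The distinction drawn in the excerpt above can be made concrete with a toy example (hypothetical data, not from the cited study): the true distribution has modes at -3 and +3, and a crude coverage measure shows how much of each mode's support a generated sample set recovers.

```python
import numpy as np

rng = np.random.default_rng(1)

def support_recovered(gen, center, half_width=1.0, eps=0.1):
    """Fraction of grid points around a true mode that lie within eps of
    some generated sample (a crude measure of how much of the mode's
    support the samples recover)."""
    grid = np.linspace(center - half_width, center + half_width, 21)
    return float(np.mean([np.any(np.abs(gen - g) < eps) for g in grid]))

# Missing mode: samples cover only the +3 mode; the -3 mode is absent.
missing = rng.uniform(2.0, 4.0, size=1000)

# Boundary distortion: both modes appear, but each is only partly recovered
# (samples cluster near the centers, missing the outer parts of each mode).
distorted = np.concatenate([rng.uniform(-3.4, -2.6, size=500),
                            rng.uniform(2.6, 3.4, size=500)])
```

With these samples, the missing-mode set recovers all of the +3 mode and none of the -3 mode, while the boundary-distorted set recovers both modes only partially.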