2021
DOI: 10.1109/tip.2021.3065845

Inspirational Adversarial Image Generation

Abstract: The task of image generation started to receive some attention from artists and designers to inspire them in new creations. However, exploiting the results of deep generative models such as Generative Adversarial Networks can be long and tedious given the lack of existing tools. In this work, we propose a simple strategy to inspire creators with new generations learned from a dataset of their choice, while providing some control on them. We design a simple optimization method to find the optimal latent paramet…
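The truncated abstract refers to an optimization over the generator's latent parameters. As an illustration only, not the authors' implementation, the sketch below assumes a pretrained, differentiable `generator` and a differentiable `criterion` that scores how close a generated image is to the desired inspiration; both names, as well as `latent_dim`, `steps`, and `lr`, are placeholders.

```python
# Illustrative sketch, not the paper's code: gradient-based search for a latent
# vector z whose generated image G(z) scores well under a user-chosen criterion.
# `generator` and `criterion` are hypothetical stand-ins (pretrained GAN generator,
# differentiable inspiration score); latent_dim, steps, and lr are arbitrary.
import torch

def optimize_latent(generator, criterion, latent_dim=512, steps=200, lr=0.05):
    z = torch.randn(1, latent_dim, requires_grad=True)  # random starting point
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        image = generator(z)      # image generated from the current latent code
        loss = criterion(image)   # lower = closer to the desired inspiration
        loss.backward()
        opt.step()
    return z.detach(), generator(z).detach()
```

When the criterion is not differentiable (for example, human preference), the same search can instead be run with a derivative-free optimizer, as the citation excerpts below discuss.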

Cited by 16 publications (8 citation statements)
References: 35 publications
“…So, research must focus on how to handle complex conditions and on how data sources should be drawn on for inspiration. This challenge is addressed in studies of fashion intelligent systems such as Semantic-Spatial Aware GAN [23] and Inspirational adversarial image generation [63].…”
Section: Challenges (mentioning, confidence: 99%)
“…• As pointed out in [19] and [37] in the context of computer vision, evolutionary algorithms provide solutions that are robust to imperfect objective functions. More precisely, by focusing on optima that are stable under random variable-wise perturbations, evolutionary algorithms behave well on the real objective function (in particular, human assessment) when we optimize a proxy (here, our criterion).…”
Section: B. Optimizer Choice: Evolutionary Computation (mentioning, confidence: 99%)
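A toy numerical illustration of the argument in the excerpt above (mine, not from the cited works; the proxy function and noise scale are arbitrary): an optimum that stays good under small random variable-wise perturbations, a flat basin, is preferable to a sharper but narrower one when the objective is only a proxy.

```python
import numpy as np

rng = np.random.default_rng(0)

def proxy(x):
    # Arbitrary proxy objective: a very sharp spurious minimum near x = 0
    # and a broad, flat basin near x = 3 (lower is better).
    return np.where(np.abs(x) < 0.05, -2.0, 0.0) - np.exp(-0.5 * (x - 3.0) ** 2)

for x_star in (0.0, 3.0):
    perturbed = x_star + 0.2 * rng.standard_normal(10_000)  # variable-wise noise
    print(f"x={x_star}: proxy value {proxy(np.array([x_star]))[0]:.2f}, "
          f"mean under perturbation {proxy(perturbed).mean():.2f}")
# The sharp minimum looks best only at the exact point; averaged over small
# perturbations, the flat basin wins, which is the stability property the
# quoted passage attributes to evolutionary search.
```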
“…Evolutionary methods [14] are known as the jeep of artificial intelligence [41], [25]: they cope with rugged objective functions without gradients and search for flat, stable optima [19]. Moreover, the work in [37], which optimizes the latent space of GANs, finds that evolutionary methods are especially robust to imperfect surrogate objective functions. Our experiments also support the use of evolutionary methods for optimizing noise injection in GANs: they show that Diagonal CMA is well suited to optimizing our rugged objective criterion, as it outperforms gradient-based methods on many datasets.…”
Section: Introduction (mentioning, confidence: 99%)
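To make the Diagonal CMA reference concrete, here is a minimal sketch of that kind of derivative-free search, assuming the Nevergrad library, which registers a DiagonalCMA optimizer. The latent dimension, budget, and toy objective are placeholders; in the setting described above, the objective would instead score `generator(z)` against the chosen criterion.

```python
# Illustrative sketch only: derivative-free optimization of a latent/noise vector
# with a diagonal-covariance CMA-ES via Nevergrad (assumed dependency).
import numpy as np
import nevergrad as ng

LATENT_DIM = 128  # placeholder dimension

def proxy_score(z: np.ndarray) -> float:
    # Placeholder for criterion(generator(z)): a rugged toy function so the
    # example runs on its own. Lower is better.
    return float(np.sum(z ** 2) + 0.1 * np.sum(np.sin(10.0 * z)))

optimizer = ng.optimizers.registry["DiagonalCMA"](
    parametrization=ng.p.Array(shape=(LATENT_DIM,)), budget=2000
)
recommendation = optimizer.minimize(proxy_score)
best_z = recommendation.value  # candidate latent/noise vector to feed the generator
print(proxy_score(best_z))
```

Because no gradients are required, the score can include non-differentiable or even human-in-the-loop terms, which is the robustness argument made in the excerpts.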
“…Image translation methods [19,48,40,59,53,22,29,44] typically resort to conditional generative adversarial networks and optimize the network through either paired data with explicit supervision or unpaired data by enforcing cycle consistency. Recently, exemplar-based image translation [18,41,47,33,43,1,54] has attracted a lot of interest due to its flexibility and improved generation quality. While most methods transfer the global style from the reference image, a recent work, CoCos-Net [57], proposes establishing dense semantic correspondence between the cross-domain inputs, and thus better preserves the fine structures of the exemplar.…”
Section: Related Work (mentioning, confidence: 99%)