2019
DOI: 10.48550/arxiv.1904.04751
Preprint
User-Controllable Multi-Texture Synthesis with Generative Adversarial Networks

Cited by 2 publications (3 citation statements); references 0 publications.
“…A more recent line of work [Alanov et al. 2019; Bergmann et al. 2017; Frühstück et al. 2019; Jetchev et al. 2016; Li and Wand 2016; Li et al. 2017b; Shaham et al. 2019; Zhou et al. 2018] has proposed using Generative Adversarial Networks (GANs) for more realistic texture synthesis, while still suffering from the inability to generalize to new, unseen textures. Zhou et al. [2018] learn a generator network that expands k × k texture blocks into 2k × 2k output through a combination of adversarial, L1, and style (Gram matrix) losses.…”
Section: Related Work
Confidence: 99%
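The style loss mentioned in the quote above compares Gram matrices of feature maps, i.e. the channel-wise inner products of the features. As a minimal illustrative sketch (NumPy only; the function names and the use of raw arrays rather than network activations are assumptions, not the cited implementation):

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a feature map of shape (channels, height, width).

    Entry (i, j) is the inner product between the flattened channels i
    and j, normalized by the number of spatial positions, so it captures
    which channels co-activate (texture statistics) rather than where.
    """
    c, h, w = features.shape
    flat = features.reshape(c, h * w)
    return flat @ flat.T / (h * w)

def style_loss(feat_a, feat_b):
    """Mean squared difference between the two Gram matrices."""
    return float(np.mean((gram_matrix(feat_a) - gram_matrix(feat_b)) ** 2))

# A feature map has zero style loss against itself.
f = np.random.rand(4, 8, 8)
print(style_loss(f, f))  # 0.0
```

In the cited work this term is combined with an adversarial loss and an L1 reconstruction loss; here only the Gram-matrix component is sketched.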
“…Other efforts [Alanov et al. 2019; Bergmann et al. 2017; Frühstück et al. 2019; Jetchev et al. 2016; Li et al. 2017b] try to train on a set of texture images. During test time, the texture being generated is either chosen by the network [Bergmann et al. 2017; Jetchev et al. 2016] or user-controlled [Alanov et al. 2019; Li et al. 2017b]. [Frühstück et al. 2019] propose a non-parametric method to synthesize large-scale, varied outputs by combining intermediate feature maps.…”
Section: Related Work
Confidence: 99%
“…Most such approaches learn purely generative models, often trained on individual exemplars. An effective technique to incorporate a user-provided exemplar is conditional GANs [20], as shown by Alanov et al. [21]. Here, G synthesizes a texture given an exemplar patch as input.…”
Section: Related Work
Confidence: 99%
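The passage above describes conditioning the generator G on a user-provided exemplar patch. A common way to feed such a conditioning signal into a convolutional generator is channel-wise concatenation with the noise input; the sketch below shows only that input-assembly step (function name, tensor shapes, and the concatenation scheme are illustrative assumptions, not the architecture of the cited paper):

```python
import numpy as np

def conditional_generator_input(noise, exemplar):
    """Stack a noise tensor and an exemplar patch along the channel axis.

    Both tensors must share spatial dimensions (height, width); the
    result is what a conditional generator G would consume, so its
    output can depend on the exemplar texture as well as the noise.
    """
    assert noise.shape[1:] == exemplar.shape[1:], "spatial dims must match"
    return np.concatenate([noise, exemplar], axis=0)

z = np.random.randn(8, 16, 16)      # 8 noise channels
patch = np.random.rand(3, 16, 16)   # RGB exemplar patch
g_in = conditional_generator_input(z, patch)
print(g_in.shape)  # (11, 16, 16)
```

The design point is that the conditioning travels with the noise through every layer of G, so the discriminator can penalize outputs whose texture statistics diverge from the exemplar.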