2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2017
DOI: 10.1109/cvpr.2017.397

Controlling Perceptual Factors in Neural Style Transfer

Abstract: Neural Style Transfer has shown very exciting results enabling new forms of image manipulation. Here we extend the existing method to introduce control over spatial location, colour information and across spatial scale. We demonstrate how this enhances the method by allowing high-resolution controlled stylisation and helps to alleviate common failure cases such as applying ground textures to sky regions. Furthermore, by decomposing style into these perceptual factors we enable the combination of style info…
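The spatial control the abstract describes restricts style statistics to particular image regions (e.g. keeping ground textures out of sky regions). A minimal numpy sketch of that idea, weighting feature maps by a guidance mask before computing the Gram matrix; this is an illustrative simplification, not the paper's implementation, and the function and variable names are hypothetical:

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (channels, positions) feature map:
    channel-by-channel correlations, normalised by position count."""
    c, n = features.shape
    return features @ features.T / n

def guided_gram(features, mask):
    """Spatially guided Gram matrix: multiply each spatial position
    by a guidance mask so only the masked region contributes to the
    style statistics (a sketch of the spatial-control idea)."""
    c, h, w = features.shape
    f = (features * mask).reshape(c, h * w)  # zero out unguided positions
    return gram_matrix(f)

# Toy example: restrict style statistics to the top half of the image,
# standing in for a "sky" region mask.
feat = np.random.rand(4, 8, 8)            # hypothetical conv feature maps
mask = np.zeros((8, 8))
mask[:4, :] = 1.0                         # top-half guidance mask
G = guided_gram(feat, mask)               # 4x4 region-restricted Gram matrix
```

In the full method one such masked Gram loss is computed per region and per layer, and the stylised image is optimised against the sum of those losses.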


Cited by 420 publications (284 citation statements). References 23 publications.
“…Our method (as well as [1]) does not take semantic information into account, so it may transfer semantically different regions from the style images to target images. This could be improved by using semantic maps [18,19]. Presentation ability of our model.…”
Section: Results
confidence: 99%
“…For now, we will briefly introduce the two specific technologies used in our first gallery event, Deep Dream (in partnership with Gray Area Foundation for the Arts in San Francisco, http://grayarea.org/event/deepdreamthe-art-of-neural-networks) 7 . These are "Inceptionism" or "Deep Dreaming", first developed by Alex Mordvintsev at Google's Zurich office (Mordvintsev et al 2015), and "style transfer", first developed by Leon Gatys and collaborators in the Bethge Lab at the Centre for Integrative Neuroscience in Tübingen (Gatys et al 2016). It is fitting and likely a sign of things to come that one of these developments came from a computer scientist working on a neurally inspired algorithm for image classification, while the other came from a grad student in neuroscience working on computational models of the brain.…”
Section: Point Out
confidence: 99%
“…However, this technique is sensitive to mismatches in the image content and several approaches have been proposed to address this issue. Gatys et al [GEB*17] add the possibility for users to guide the transfer with annotations. In the context of photographic transfer, Luan et al [LPSB17] limit mismatches using scene analysis.…”
Section: Introduction
confidence: 99%