2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr.2016.66

Split and Match: Example-Based Adaptive Patch Sampling for Unsupervised Style Transfer

Abstract: This paper presents a novel unsupervised method to transfer the style of an example image to a source image. The complex notion of image style is here considered as a local texture transfer, optionally coupled with a global color transfer. For the local texture transfer, we propose a new method based on an adaptive patch partition that captures the style of the example image and preserves the structure of the source image. More precisely, this example-based partition predicts how well a source patch matches an…
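The "split and match" partition described in the abstract can be pictured as a quadtree-style refinement: a source patch becomes one cell of the partition when it can be matched well somewhere in the example image, and is split into smaller patches otherwise. The sketch below is only a rough Python/NumPy illustration of that idea, not the paper's algorithm; the brute-force MSE matching, the search stride, the minimum patch size and the tolerance tol are placeholder assumptions.

    import numpy as np

    def best_match_error(patch, style_img, stride=4):
        # Smallest mean squared error between `patch` and any same-sized patch
        # taken from `style_img` on a coarse grid (brute force, for clarity).
        ph, pw = patch.shape[:2]
        H, W = style_img.shape[:2]
        best = np.inf
        for y in range(0, H - ph + 1, stride):
            for x in range(0, W - pw + 1, stride):
                cand = style_img[y:y + ph, x:x + pw].astype(float)
                best = min(best, float(np.mean((cand - patch.astype(float)) ** 2)))
        return best

    def split_and_match(src, style, y, x, size, min_size=8, tol=200.0, cells=None):
        # Keep the square source patch at (y, x) as one cell of the partition if
        # it matches the example image well enough; otherwise split it into four
        # quadrants and recurse (a quadtree-style adaptive partition).
        if cells is None:
            cells = []
        patch = src[y:y + size, x:x + size]
        if size <= min_size or best_match_error(patch, style) <= tol:
            cells.append((y, x, size))
        else:
            half = size // 2
            for dy in (0, half):
                for dx in (0, half):
                    split_and_match(src, style, y + dy, x + dx, half,
                                    min_size, tol, cells)
        return cells

For a 128x128 grayscale pair, split_and_match(src, style, 0, 0, 128) would return a list of (y, x, size) cells whose sizes reflect how easily each source region finds a stylistic match in the example image.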

Cited by 121 publications (88 citation statements)
References 24 publications

Citation statements, ordered by relevance:

“…However, most of these NPR stylisation algorithms are designed for particular artistic styles [3], [4] and cannot be easily extended to other styles. In the community of computer vision, style transfer is usually studied as a generalised problem of texture synthesis, which is to extract and transfer the texture from the source to target [5], [6], [7], [8]. Hertzmann et al [9] further propose a framework named image analogies to perform a generalised style transfer by learning the analogous transformation from the provided example pairs of unstylised and stylised images.…”
Section: Introduction (mentioning)
Confidence: 99%

“…Earlier methods [18] are usually non-parametric and are built upon low-level image features. Inspired by Image Analogies [18], approaches [11,30,39,40] are based on finding dense correspondence between content and style image and often require image pairs to depict similar content. Therefore, these methods do not scale to the setting of arbitrary content images.…”
Section: Related Work (mentioning)
Confidence: 99%

“…Project page: http://www.icst.pku.edu.cn/struct/Projects/UTS.html Style [2]. Non-parametric methods take samples from the style image and place the samples based on pixel intensity [1], [3], [4] or deep features [5] of the target image to synthesize a new image. Parametric methods represent the style as statistical features, and adjust the target image to satisfy these features.…”
Section: Introduction (mentioning)
Confidence: 99%
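To make the parametric/non-parametric distinction in this last excerpt concrete, the following minimal sketch shows the simplest parametric case: the style is summarised by per-channel mean and standard deviation, and the target image is adjusted to satisfy those statistics. It is only an illustrative stand-in, not the method of any of the works cited above, which rely on richer statistics (for example, deep feature statistics).

    import numpy as np

    def match_channel_statistics(target, style, eps=1e-8):
        # Shift and scale each colour channel of `target` so that its mean and
        # standard deviation match those of `style`; clip back to 8-bit range.
        target = target.astype(np.float64)
        style = style.astype(np.float64)
        out = np.empty_like(target)
        for c in range(target.shape[2]):
            t_mean, t_std = target[..., c].mean(), target[..., c].std()
            s_mean, s_std = style[..., c].mean(), style[..., c].std()
            out[..., c] = (target[..., c] - t_mean) / (t_std + eps) * s_std + s_mean
        return np.clip(out, 0, 255).astype(np.uint8)

A non-parametric method would instead copy actual pixel or patch samples from the style image into the output, in the spirit of the split-and-match sketch shown after the abstract.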