2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr46437.2021.00926

In the light of feature distributions: moment matching for Neural Style Transfer

Cited by 46 publications (17 citation statements) | References 19 publications
“…Gatys also investigated what happens to generated images as a result of perceptual factors during style transfer as well as what happens to generated images afterward. The application of stylized images in real-world scenarios is severely constrained by the sluggishness of iterative optimization processes [10][11][12][13] due to the excessively slow rate at which stylized images are generated.…”
Section: Image Iteration and Generation Methods
mentioning, confidence: 99%
“…The loss function for this method, which trains an image modification model on a huge dataset and creates stylized images in the forward pass, can be defined using perceptual loss or Markov random field loss, depending on the circumstances. It is possible to transfer the style of images generated by generative adversarial networks (GANs) such as CycleGAN and DualGAN with great success [9][10][11][12][13][14]. But the dataset requirements and discriminative algorithms for GAN-based approaches are extremely rigorous.…”
Section: Introduction
mentioning, confidence: 99%
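The perceptual loss mentioned in this excerpt is typically computed from the feature maps of a fixed, pretrained classification network rather than from pixels. The following is a minimal sketch of that idea using torchvision's VGG-16; the layer indices and equal weighting are assumptions made for illustration, not the configuration of any cited work.

```python
import torch
import torch.nn.functional as F
from torchvision import models

# Fixed, pretrained feature extractor; layer indices below are assumed for illustration.
vgg = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1).features.eval()
for p in vgg.parameters():
    p.requires_grad_(False)

FEATURE_LAYERS = {8, 15, 22}  # assumed ReLU indices; any subset of layers could be used

def perceptual_loss(generated: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Sum of MSE distances between VGG feature maps of two image batches."""
    loss = torch.zeros((), device=generated.device)
    x, y = generated, target
    for i, layer in enumerate(vgg):
        x, y = layer(x), layer(y)
        if i in FEATURE_LAYERS:
            loss = loss + F.mse_loss(x, y)
    return loss
```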
“…NST, first introduced in Gatys et al. [21], can be optimization-based [9,21,22] or model-based [16,28,30,51]. It is inherently defined in the image domain by matching CNN features either globally or in a local, patch-based manner [21,31,34,36,41]. Thus, it cannot directly utilize 3D data like depth or surface normals of a mesh.…”
Section: Related Work
mentioning, confidence: 99%
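"Global" feature matching, as described in this excerpt, usually means matching summary statistics of whole feature maps rather than individual activations; the Gram matrix of Gatys et al. is the classic instance. A minimal sketch, assuming feature tensors of shape (batch, channels, height, width) already extracted from some network:

```python
import torch

def gram_matrix(features: torch.Tensor) -> torch.Tensor:
    """Channel-by-channel correlation of a feature map, normalized by its size."""
    b, c, h, w = features.shape
    f = features.reshape(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def global_style_loss(gen_feats, style_feats) -> torch.Tensor:
    """Squared distance between Gram matrices, summed over the chosen layers."""
    return sum(((gram_matrix(g) - gram_matrix(s)) ** 2).sum()
               for g, s in zip(gen_feats, style_feats))
```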
“…Inspired by previous works, several studies have synthesized realistic images using the properties of feature maps (Kalischek, Wegner, and Schindler 2021; Heitz et al. 2021; Xu et al. 2021). Many style transfer algorithms (Gatys, Ecker, and Bethge 2016; Lin et al. 2021; Kalischek, Wegner, and Schindler 2021) extract multi-scale feature maps at multiple layers by forwarding the style image through the network. They then render the style of one image onto a content image while preserving the semantic information of the content image.…”
Section: Introduction
mentioning, confidence: 99%
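The multi-scale extraction described in this excerpt forwards the style image once and records feature maps at several depths, so the style is characterized by per-layer statistics. The sketch below matches only the first two moments (channel-wise mean and standard deviation) at each recorded layer; it is a simplified stand-in for the distribution-matching objectives discussed in these works, not a reproduction of any specific method.

```python
import torch

def channel_moments(feats: torch.Tensor):
    """Channel-wise mean and std over spatial positions; each has shape (batch, channels)."""
    b, c, h, w = feats.shape
    flat = feats.reshape(b, c, h * w)
    return flat.mean(dim=2), flat.std(dim=2)

def moment_matching_loss(gen_layers, style_layers) -> torch.Tensor:
    """Match first and second moments of multi-scale feature maps, layer by layer."""
    loss = torch.zeros(())
    for g, s in zip(gen_layers, style_layers):
        mg, sg = channel_moments(g)
        ms, ss = channel_moments(s)
        loss = loss + ((mg - ms) ** 2).mean() + ((sg - ss) ** 2).mean()
    return loss
```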