2023
DOI: 10.1117/1.jei.32.5.053017
BcsUST: universal style transformation network for balanced content styles

Min Zhao,
XueZhong Qian,
Wei Song

Abstract: Composition, color, and brushstrokes are the primary factors in evaluating and appreciating artwork and are vital points to be considered and addressed in arbitrary style transfer tasks. However, existing methods do not balance these elements well, so we propose an approach that balances content structure and style patterns for universal style transformation (BcsUST). Specifically, two lightweight encoders based on residual networks pass the extracted style and content features into the multi-domain structure …

Cited by 1 publication (2 citation statements)
References 46 publications
“…Gatys et al. 1 first applied CNNs to style transfer tasks, providing a new direction for deep image processing. Starting from the initial single-model, single-style algorithms and moving to single-model, multi-style algorithms, a large number of CNN-based arbitrary style transfer methods have since been proposed, 2–14 and this technological breakthrough has made style transfer more flexible and practical. To enhance style expression, some of these methods start from the brushstroke: 15–17 they generate meaningful stroke parameters in a vectorized environment, use them to control stroke size and intensity, and further render the texture to reproduce the artist's characteristic brushwork.…”
Section: Introduction (mentioning, confidence: 99%)
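The arbitrary ("universal") style transfer family this statement refers to typically works by re-aligning the statistics of content features to those of style features. A minimal sketch of one widely used building block, adaptive instance normalization (AdaIN); this is a standard technique in the cited literature, not necessarily the exact mechanism used in BcsUST:

```python
import numpy as np

def adain(content, style, eps=1e-5):
    """Adaptive instance normalization on channels-first (C, H, W) feature maps:
    normalize each content channel, then re-scale it to the per-channel
    mean and standard deviation of the style features."""
    c_mean = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True)
    s_mean = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True)
    return s_std * (content - c_mean) / (c_std + eps) + s_mean
```

In a full pipeline, `content` and `style` would be VGG-style encoder activations and the result would be passed through a trained decoder; here random arrays stand in for both.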
“…Generative adversarial networks have also been widely used in style transfer tasks, 4,23,26–32 where the generator gradually learns to produce more realistic images through adversarial adjustment against the discriminator, while the discriminator steadily improves its judgment of image authenticity. In this process the discriminator develops an aesthetic ability similar to that of humans, and this feature can be used to solve problem (1): the discriminator's aesthetic judgment directly affects the quality of the style transfer result and determines whether the final generated image meets people's aesthetic standards.…”
Section: Introduction (mentioning, confidence: 99%)
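The adversarial adjustment described above is driven by two opposing objectives. A minimal sketch of the standard (non-saturating) GAN losses on discriminator output probabilities; the function names are illustrative, and the cited works may use other variants such as hinge or Wasserstein losses:

```python
import numpy as np

def discriminator_loss(d_real, d_fake, eps=1e-7):
    """Binary cross-entropy loss for the discriminator: push D(real) toward 1
    and D(fake) toward 0, given probabilities in (0, 1)."""
    d_real = np.clip(d_real, eps, 1 - eps)
    d_fake = np.clip(d_fake, eps, 1 - eps)
    return -(np.log(d_real).mean() + np.log(1 - d_fake).mean())

def generator_loss(d_fake, eps=1e-7):
    """Non-saturating generator loss: push D(fake) toward 1, so the
    generator is rewarded for fooling the discriminator."""
    d_fake = np.clip(d_fake, eps, 1 - eps)
    return -np.log(d_fake).mean()
```

Minimizing these two losses alternately is what lets the discriminator sharpen its judgment of authenticity while the generator's outputs become harder to tell from real artwork.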