2021 IEEE/CVF International Conference on Computer Vision (ICCV) 2021
DOI: 10.1109/iccv48922.2021.00658
AdaAttN: Revisit Attention Mechanism in Arbitrary Neural Style Transfer

Cited by 248 publications (174 citation statements) · References 28 publications
“…For the latter, the self-attention mechanism in style transfer performs outstandingly at assigning different style patterns to different regions of an image; SANet [28], SAFIN [10], and AdaAttN [26] are examples. SANet [28] introduced the first attention method for feature embedding in style transfer.…”
Section: Related Work
confidence: 99%
“…SANet [28], SAFIN [10], and AdaAttN [26] are examples. SANet [28] introduced the first attention method for feature embedding in style transfer. SAFIN [10] extended SANet [28] by using a self-attention module to learn the parameters of its spatially adaptive normalization module, which makes normalization more spatially flexible and semantically aware.…”
Section: Related Work
confidence: 99%
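The attention-based feature embedding described in these citation statements can be illustrated with a minimal numpy sketch. This is an assumption-laden simplification in the spirit of SANet/AdaAttN, not the papers' actual implementations: `attention_style_stats` is a hypothetical name, and the cosine-normalized query/key and attention-weighted mean/std are a common simplified formulation of per-position style statistics.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_style_stats(content, style):
    """For each content position, attend over all style positions and
    return attention-weighted per-position mean/std of style features.

    content: (Nc, C) flattened content feature map
    style:   (Ns, C) flattened style feature map
    (Simplified sketch; real modules use learned query/key/value projections.)
    """
    q = content / (np.linalg.norm(content, axis=1, keepdims=True) + 1e-8)
    k = style / (np.linalg.norm(style, axis=1, keepdims=True) + 1e-8)
    attn = softmax(q @ k.T, axis=1)        # (Nc, Ns); each row sums to 1
    mean = attn @ style                    # (Nc, C) attention-weighted mean
    var = attn @ (style ** 2) - mean ** 2  # attention-weighted variance
    std = np.sqrt(np.clip(var, 0.0, None))
    return mean, std

rng = np.random.default_rng(0)
content = rng.normal(size=(16, 8))  # 16 content positions, 8 channels
style = rng.normal(size=(20, 8))    # 20 style positions, 8 channels
mean, std = attention_style_stats(content, style)

# Per-position stylization: normalize the content feature, then scale and
# shift it by the attended style statistics.
c_norm = (content - content.mean(0)) / (content.std(0) + 1e-5)
stylized = std * c_norm + mean
```

Because the statistics are computed per content position rather than globally, different regions of the content image can pick up different style patterns, which is the property the cited works attribute to attention.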
“…ResNeSt applies channel attention to different network branches, succeeding in cross-feature interaction. Furthermore, attention mechanisms [14][15][16] have also been introduced into image classification tasks. The attention mechanism focuses on important information with high weights, ignores irrelevant information with low weights, learns independently, and can continuously adjust the weights.…”
Section: Introduction
confidence: 99%
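The weighting behavior described in this statement can be sketched in a few lines: a softmax turns relevance scores into weights that emphasize important features and suppress irrelevant ones. The scores and features here are made up for illustration; in a real network the scores would be produced by a learned scoring module.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

# Three candidate features with relevance scores (hypothetical values).
features = np.array([[1.0, 0.0],    # highly relevant
                     [0.0, 1.0],    # weakly relevant
                     [5.0, 5.0]])   # irrelevant, despite large magnitude
scores = np.array([3.0, 0.5, -2.0])

weights = softmax(scores)        # high score -> high weight, low -> near zero
attended = weights @ features    # weighted sum dominated by the top feature
```

Because the weights come from a differentiable softmax, they can be adjusted continuously during training, which is the "learns independently and continuously adjusts the weights" property the quote refers to.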