2022
DOI: 10.3390/electronics11182929

LPAdaIN: Light Progressive Attention Adaptive Instance Normalization Model for Style Transfer

Abstract: To improve the generation quality of image style transfer, this paper proposes a light progressive attention adaptive instance normalization (LPAdaIN) model that combines the adaptive instance normalization (AdaIN) layer and the convolutional block attention module (CBAM). In the construction of the model structure, first, a lightweight autoencoder is built to reduce the information loss in the encoding process by reducing the number of network layers and to alleviate the distortion of the stylized image struc…
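The abstract builds on the AdaIN operation, which aligns the channel-wise mean and standard deviation of content features to those of style features. The following is a minimal sketch of that operation, assuming PyTorch-style (N, C, H, W) feature maps; the function name and epsilon value are illustrative and not taken from the paper's implementation.

```python
import torch

def adain(content_feat, style_feat, eps=1e-5):
    # Channel-wise statistics over the spatial dimensions of (N, C, H, W) tensors.
    c_mean = content_feat.mean(dim=(2, 3), keepdim=True)
    c_std = content_feat.std(dim=(2, 3), keepdim=True) + eps
    s_mean = style_feat.mean(dim=(2, 3), keepdim=True)
    s_std = style_feat.std(dim=(2, 3), keepdim=True) + eps
    # Normalize the content features, then re-scale and shift them
    # with the style features' statistics.
    return s_std * (content_feat - c_mean) / c_std + s_mean
```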

Cited by 4 publications (1 citation statement) | References: 16 publications
“…After the transpose convolution operation, instance normalization without learnable parameters is applied across the height and width dimensions to the feature maps corresponding to MRI and PET modalities generated at the higher resolution. Instance normalization without learnable parameters is a variant of instance normalization, in which the scaling and shifting factors are not learned but are instead fixed and applied in a predetermined manner [40,41]. In this study, instance normalization helps normalize the activations of individual instances independently, without introducing any learnable parameters.…”
Section: Basic Image Feature Map Extraction Based On Optimized Transp…
confidence: 99%
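The cited statement describes instance normalization without learnable parameters, i.e. each instance's feature maps are normalized over their height and width with fixed (non-learned) scaling and shifting. A minimal PyTorch sketch follows; the layer choice (nn.InstanceNorm2d with affine=False) and the tensor sizes are illustrative assumptions, not the cited paper's exact configuration.

```python
import torch
import torch.nn as nn

# affine=False: no learned scale (gamma) or shift (beta); each instance's
# feature map is normalized independently over its height and width.
inst_norm = nn.InstanceNorm2d(num_features=64, affine=False)

x = torch.randn(2, 64, 32, 32)  # (batch, channels, height, width)
y = inst_norm(x)                # per-instance, per-channel zero mean / unit variance
```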