2021 International Conference on Advancements in Electrical, Electronics, Communication, Computing and Automation (ICAECA)
DOI: 10.1109/icaeca52838.2021.9675723
Neural Style Transfer Using VGG19 and Alexnet

Cited by 15 publications (7 citation statements)
References 2 publications
“…Note that while the CycleGAN does not require a corresponding style image to generate stylized images as it directly learns transferable representations from the content image, we observed that the CycleGAN generated suboptimal results for our problem compared to the neural style transfer algorithm. For our study, we used a combination of two different approaches for synthetic image generation with the neural style transfer algorithm: A VGG-19 model architecture (Simonyan and Zisserman, 2014; Kavitha et al, 2021), originally pretrained with weights from ImageNet for image classification. A pretrained fast style transfer model leveraging arbitrary image stylization (Ghiasi et al, 2017) (no fine-tuning required). …”
Section: Proposed Methodology and Learning Models
confidence: 99%
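The VGG-19-based transfer described in the statement above typically optimizes a Gatys-style objective: a content loss on raw feature maps plus a style loss on their Gram matrices. A minimal numpy sketch, using random arrays as stand-ins for VGG-19 activations (the function names and shapes here are illustrative assumptions, not code from the cited work):

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (channels, height, width) feature map:
    channel-by-channel inner products capture style statistics."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (c * h * w)

def style_loss(gen_feats, style_feats):
    """Mean squared difference between Gram matrices."""
    g_gen, g_style = gram_matrix(gen_feats), gram_matrix(style_feats)
    return np.mean((g_gen - g_style) ** 2)

def content_loss(gen_feats, content_feats):
    """Mean squared difference between raw feature maps."""
    return np.mean((gen_feats - content_feats) ** 2)

rng = np.random.default_rng(0)
feats = rng.standard_normal((64, 32, 32))  # stand-in for a VGG-19 activation
assert style_loss(feats, feats) == 0.0     # identical styles: zero loss
assert content_loss(feats, feats) == 0.0   # identical content: zero loss
```

In practice these losses are evaluated on activations from several pretrained VGG-19 layers and combined with weights, then the generated image is updated by gradient descent.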
“…1. A VGG-19 model architecture (Simonyan and Zisserman, 2014;Kavitha et al, 2021), originally pretrained with weights from ImageNet for image classification. 2.…”
Section: Neural Style Transfer Algorithm
confidence: 99%
“…As its name suggests, VGG19 includes 19 layers. Instead of using large 11 × 11 filters like AlexNet, the VGG19 model uses multiple 3 × 3 filters per layer [32,33].…”
Section: Proposed Ensemble Model
confidence: 99%
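The trade-off this statement describes can be checked with simple arithmetic: stacking small 3 × 3 convolutions grows the receptive field to that of one large filter while using fewer weights. A quick sketch (assuming stride 1, no pooling, and equal input/output channel counts):

```python
def stacked_receptive_field(kernel, depth):
    """Receptive field of `depth` stacked kernel x kernel convs, stride 1."""
    return 1 + depth * (kernel - 1)

def conv_params(kernel, channels):
    """Weight count of one conv layer mapping `channels` -> `channels` maps."""
    return kernel * kernel * channels * channels

C = 64
# Two stacked 3x3 convs see the same 5x5 window as a single 5x5 conv...
assert stacked_receptive_field(3, 2) == 5
# ...with fewer weights: 18*C^2 versus 25*C^2.
assert 2 * conv_params(3, C) < conv_params(5, C)
# Five stacked 3x3 convs cover the 11x11 window of AlexNet's first-layer filter.
assert stacked_receptive_field(3, 5) == 11
```

The extra nonlinearity between the stacked 3 × 3 layers is a further advantage VGG-style networks gain over a single large filter.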
“…Style transfer is a program that automatically extracts the style features of one image and transfers them to another, so that the restyled image still looks natural [1]. The core of style transfer is the convolutional neural network, which extracts feature vectors from the image layer by layer via the convolution kernels in its hidden layers [2]. A well-trained style transfer model can apply the functions fitted during training to a new pixel matrix through a multilinear operation.…”
Section: Introduction
confidence: 99%
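The layer-by-layer feature extraction described in the statement above can be sketched with a plain 2D convolution: each layer re-filters the previous layer's output map. A minimal numpy version, where a hand-written edge kernel stands in for the learned kernels of a real network:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D cross-correlation: slide the kernel over the image and
    take the inner product at each position."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.empty((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A vertical-edge kernel applied twice: each "layer" re-filters the
# previous layer's output, shrinking the map by kernel_size - 1.
edge = np.array([[1.0, 0.0, -1.0]] * 3)
img = np.arange(64.0).reshape(8, 8)
layer1 = conv2d(img, edge)     # 6x6 feature map
layer2 = conv2d(layer1, edge)  # 4x4 feature map
assert layer1.shape == (6, 6) and layer2.shape == (4, 4)
```

Real networks add learned multi-channel kernels, nonlinearities, and pooling between layers, but the sliding-window inner product is the same operation throughout.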
“…Current style transfer methods preserve the structure and main content of an image while converting its colors, lines, lighting, and similar attributes, acting only as a filter rather than recreating the image. This research aims to reduce the randomness in style transfer by varying the image preprocessing methods and applying the VGG network, which is well suited to image processing, multiple times [2]. In this research, we deconstructed the concept of style by digitizing information from the artistic field and extracting structured information from it, so that the key elements of style can be obtained through a linear operation.…”
Section: Introduction
confidence: 99%