2022
DOI: 10.1080/10618600.2022.2061496
Dermoscopic Image Classification with Neural Style Transfer

Cited by 2 publications (2 citation statements)
References 45 publications
“…In Kruthika et al. [27], autoencoders were used to extract features from 3D patches, obtaining better results than with traditional 2D patches. In [28], the authors show that classification performance on dermoscopic analyses significantly outperforms that on raw images. Researchers have also used LSTM (long short-term memory) layers on top of three-dimensional convolutional layers to improve model performance.…”
Section: Methods
confidence: 99%
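The arrangement described in the excerpt — an LSTM stacked on top of three-dimensional convolutional layers — can be sketched as below. This is a minimal illustrative sketch in PyTorch, not the cited authors' implementation; all layer sizes, channel counts, and the choice to treat the depth axis as the LSTM's time dimension are assumptions.

```python
import torch
import torch.nn as nn

class Conv3DLSTM(nn.Module):
    """Hypothetical sketch: 3D conv feature extractor feeding an LSTM classifier."""

    def __init__(self, in_channels=1, hidden=64, num_classes=2):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv3d(in_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool3d((1, 2, 2)),  # pool spatial dims only; keep depth as the sequence axis
        )
        # input_size = channels * pooled H * pooled W (16 * 16 * 16 for 32x32 input)
        self.lstm = nn.LSTM(input_size=16 * 16 * 16, hidden_size=hidden, batch_first=True)
        self.fc = nn.Linear(hidden, num_classes)

    def forward(self, x):              # x: (batch, channels, depth, H, W)
        f = self.conv(x)               # (batch, 16, depth, H/2, W/2)
        b, c, d, h, w = f.shape
        seq = f.permute(0, 2, 1, 3, 4).reshape(b, d, c * h * w)  # depth slices as time steps
        out, _ = self.lstm(seq)        # (batch, depth, hidden)
        return self.fc(out[:, -1])     # classify from the last time step

model = Conv3DLSTM()
logits = model(torch.randn(2, 1, 8, 32, 32))
print(logits.shape)  # torch.Size([2, 2])
```

Treating each depth slice of the 3D feature volume as one LSTM time step is one common way to combine the two layer types; the cited works may sequence over a different axis.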
“…In Kruthika et al. [19], autoencoders were used to extract features from 3D patches, showing better results than traditional 2D patches. In [22], the authors show that classification performance on dermoscopic analyses significantly outperforms that on raw images. Researchers have also used LSTM (long short-term memory) layers on top of three-dimensional convolutional layers to improve model performance.…”
Section: Classification and Feature Extraction
confidence: 99%