2023
DOI: 10.1016/j.compbiomed.2022.106456
FusionSegNet: Fusing global foot features and local wound features to diagnose diabetic foot

Cited by 9 publications (3 citation statements)
References 24 publications
“…The low-frequency information corresponds to the overall shape of the wound, while the high-frequency information corresponds to the details of the wound. The combination of high and low-frequency information from wound images provides a more comprehensive wound characterization, which is essential for wound classification [36].…”
Section: Proposed Architecture
confidence: 99%
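The excerpt above does not say how the two frequency bands are obtained. A minimal sketch of one common way to split a wound image into low- and high-frequency components (a Gaussian blur for the low band, the residual for the high band) is shown below; the function name, the choice of filter, and the value of sigma are illustrative assumptions, not the cited paper's implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def split_frequency_bands(image: np.ndarray, sigma: float = 3.0):
    """Split a grayscale wound image into low- and high-frequency components.

    The low band (Gaussian-smoothed image) captures the overall wound shape;
    the high band (residual) captures fine detail such as edges and texture.
    Both the Gaussian filter and the sigma value are assumptions made for
    illustration, not the cited work's method.
    """
    image = image.astype(np.float32)
    low = gaussian_filter(image, sigma=sigma)   # coarse structure / shape
    high = image - low                          # fine detail / texture
    return low, high

# Example: stack both bands as channels so a classifier sees shape and detail.
if __name__ == "__main__":
    img = np.random.rand(224, 224).astype(np.float32)  # stand-in wound image
    low, high = split_frequency_bands(img)
    combined = np.stack([low, high], axis=-1)           # shape (224, 224, 2)
    print(combined.shape)
```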
“…The convolutional layers produce output shapes conv2d_1 = (None, 75, 75, 32), conv2d_2 = (None, 38, 38, 64), and conv2d_3 = (None, 19, 19, 128); the pooling layers produce max_pooling2d = (None, 38, 38, 32), max_pooling2d_1 = (None, 19, 19, 64), and max_pooling2d_3 = (None, 10, 10, 128). The convolutional neural network (CNN) operates on 2D feature maps, where each feature is computed as a weighted sum of the input pixels.…”
Section: Introduction
confidence: 99%
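One configuration that reproduces the layer output shapes quoted above is a small Keras stack with a 75×75×3 input and 'same' padding throughout. This is an illustrative reconstruction inferred from the listed shapes, not the cited work's actual code; the input size, kernel size, and padding are assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Illustrative reconstruction: a 75x75x3 input with 'same' padding reproduces
# the quoted shapes (75,75,32) -> (38,38,32) -> (38,38,64) -> (19,19,64)
# -> (19,19,128) -> (10,10,128). Input size and padding are assumptions.
model = models.Sequential([
    layers.Input(shape=(75, 75, 3)),
    layers.Conv2D(32, 3, padding="same", activation="relu"),   # (None, 75, 75, 32)
    layers.MaxPooling2D(2, padding="same"),                    # (None, 38, 38, 32)
    layers.Conv2D(64, 3, padding="same", activation="relu"),   # (None, 38, 38, 64)
    layers.MaxPooling2D(2, padding="same"),                    # (None, 19, 19, 64)
    layers.Conv2D(128, 3, padding="same", activation="relu"),  # (None, 19, 19, 128)
    layers.MaxPooling2D(2, padding="same"),                    # (None, 10, 10, 128)
])
model.summary()
```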
“…The proposed system classified the non-DF and DF images with a higher AUC (area under the receiver operating characteristic curve). This performance indicates that the suggested methodology extracts wound features more effectively, thereby improving classification performance [10]. Although conventional models have performed well in DFU classification, there remains scope for improvement in accuracy.…”
Section: Introduction
confidence: 99%
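The AUC mentioned above is computed from a classifier's predicted probabilities over DF and non-DF images. A minimal sketch using scikit-learn is given below; the tooling and the label/score values are assumptions for illustration, not results from the cited work.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical labels (1 = diabetic foot, 0 = non-DF) and predicted
# probabilities from a binary classifier; values are illustrative only.
y_true = np.array([0, 0, 1, 1, 1, 0, 1, 0])
y_score = np.array([0.12, 0.40, 0.85, 0.67, 0.93, 0.31, 0.58, 0.22])

# Area under the receiver operating characteristic curve.
auc = roc_auc_score(y_true, y_score)
print(f"AUC: {auc:.3f}")
```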