2022
DOI: 10.1080/21681163.2022.2040053
U-Net Convolutional Neural Networks for breast IR imaging segmentation on frontal and lateral view

Cited by 9 publications (4 citation statements)
References 12 publications
“…To enhance the dataset, we focused on segmenting the region of interest (ROI) to accurately extract key anatomical features, thereby improving the accuracy of our analyses (Figure 3). For segmentation, we adapted a U-Net based method originally developed by Carvalho et al. [15]. While the original model used a 64 × 64 input size for angular images, we found this insufficient for our needs. Consequently, we employed a modified U-Net model with an increased input size of 256 × 256, allowing for more detailed extraction of segmentation masks.…”
Section: Image Preprocessing
confidence: 99%
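As a rough illustration of the kind of model described above, the sketch below builds a small U-Net in Keras with a 256 × 256 single-channel input. The depth, filter counts, and training settings are assumptions chosen for brevity, not the architecture used by Carvalho et al. [15] or by the citing authors.

```python
# Minimal U-Net sketch for breast thermogram segmentation with a
# 256 x 256 single-channel input. Illustrative only; not the cited model.
import tensorflow as tf
from tensorflow.keras import layers, Model

def conv_block(x, filters):
    # Two 3x3 convolutions with ReLU: the standard U-Net building block.
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    return x

def build_unet(input_shape=(256, 256, 1)):
    inputs = layers.Input(shape=input_shape)

    # Encoder: downsample while doubling the number of feature channels.
    c1 = conv_block(inputs, 32); p1 = layers.MaxPooling2D()(c1)
    c2 = conv_block(p1, 64);     p2 = layers.MaxPooling2D()(c2)
    c3 = conv_block(p2, 128);    p3 = layers.MaxPooling2D()(c3)

    # Bottleneck.
    b = conv_block(p3, 256)

    # Decoder: upsample and concatenate encoder features (skip connections).
    u3 = layers.Conv2DTranspose(128, 2, strides=2, padding="same")(b)
    c4 = conv_block(layers.concatenate([u3, c3]), 128)
    u2 = layers.Conv2DTranspose(64, 2, strides=2, padding="same")(c4)
    c5 = conv_block(layers.concatenate([u2, c2]), 64)
    u1 = layers.Conv2DTranspose(32, 2, strides=2, padding="same")(c5)
    c6 = conv_block(layers.concatenate([u1, c1]), 32)

    # 1x1 convolution with sigmoid output for a binary breast-region mask.
    outputs = layers.Conv2D(1, 1, activation="sigmoid")(c6)
    return Model(inputs, outputs)

model = build_unet()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```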
“…Their IoU (intersection over union) value was 94.38%, with an average precision of 98.86% and a recall of 98.46%. De Carvalho et al. [15] applied U-Net to segment the breast region using thermal images taken from the frontal and lateral views. Two different U-Net architectures were trained and tested using 283 frontal-view and 332 lateral-view thermograms from the DMR-IR dataset.…”
Section: Segmentation Using Deep Learning Techniques
confidence: 99%
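The overlap metrics quoted above can be computed directly from binary segmentation masks. The NumPy sketch below is an illustrative implementation; the function and variable names are ours, not taken from the cited papers.

```python
# Compute IoU, precision, and recall for binary segmentation masks.
import numpy as np

def segmentation_metrics(y_true: np.ndarray, y_pred: np.ndarray, eps: float = 1e-7):
    """Return (IoU, precision, recall) for boolean masks of equal shape."""
    y_true = y_true.astype(bool)
    y_pred = y_pred.astype(bool)

    tp = np.logical_and(y_true, y_pred).sum()    # true-positive pixels
    fp = np.logical_and(~y_true, y_pred).sum()   # false-positive pixels
    fn = np.logical_and(y_true, ~y_pred).sum()   # false-negative pixels

    iou = tp / (tp + fp + fn + eps)              # intersection over union
    precision = tp / (tp + fp + eps)
    recall = tp / (tp + fn + eps)
    return iou, precision, recall
```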
“…(2) U-Net and other modified U-Net architectures demonstrate strong performance in thermal image segmentation. However, it was observed in [15] that segmentation performance was negatively affected for lateral thermograms, owing to variations in orientation and the limited information in the lateral view, which demands analysis separate from that of frontal thermograms.…”
Section: Classification
confidence: 99%
“…We compare the performance of different networks using two indicators: Acc and IoU [17]. Acc, also known as accuracy, refers to the proportion of correctly predicted pixels among all pixels.…”
Section: Evaluation Metrics
confidence: 99%
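For reference, the standard pixel-wise definitions of the two indicators quoted above, written here in our own notation rather than copied from the cited work, are

$$\mathrm{Acc} = \frac{TP + TN}{TP + TN + FP + FN}, \qquad \mathrm{IoU} = \frac{TP}{TP + FP + FN},$$

where TP, TN, FP, and FN denote the numbers of true-positive, true-negative, false-positive, and false-negative pixels, respectively.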