2022
DOI: 10.3390/app12104942

End-to-End Convolutional Neural Network Framework for Breast Ultrasound Analysis Using Multiple Parametric Images Generated from Radiofrequency Signals

Abstract: Breast ultrasound (BUS) is an effective clinical modality for diagnosing breast abnormalities in women. Deep-learning techniques based on convolutional neural networks (CNN) have been widely used to analyze BUS images. However, the low quality of B-mode images owing to speckle noise and a lack of training datasets makes BUS analysis challenging in clinical applications. In this study, we proposed an end-to-end CNN framework for BUS analysis using multiple parametric images generated from radiofrequency (RF) si…

Cited by 4 publications (4 citation statements)
References 39 publications
“…Kim et al. (18) employed a CNN to process multiple parametric images generated from RF signals, including grayscale, entropy, attenuation, and phase images, for the analysis of benign and malignant breast tumors. The highest accuracy and sensitivity were 83.00% and 92.24%, respectively.…”
Section: Discussion
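The parametric images named in the statement above (entropy, attenuation, phase) are computed from the RF data rather than from the B-mode image. As a rough illustration only, not the cited authors' pipeline, an entropy parametric map can be sketched as a sliding-window Shannon entropy over the echo envelope; the `entropy_map` helper, window size, and bin count below are all illustrative choices:

```python
import numpy as np

def entropy_map(envelope, win=16, bins=32):
    """Sliding-window Shannon entropy over a 2-D echo-envelope image.

    envelope : 2-D array of non-negative envelope samples
    win      : side length of the square analysis window
    bins     : histogram bins used to estimate the local amplitude distribution
    """
    h, w = envelope.shape
    out = np.zeros((h - win + 1, w - win + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = envelope[i:i + win, j:j + win].ravel()
            counts, _ = np.histogram(patch, bins=bins)
            p = counts / counts.sum()
            p = p[p > 0]  # drop empty bins so log2 is defined
            out[i, j] = -np.sum(p * np.log2(p))
    return out

# Toy speckle-like envelope (magnitude of Gaussian noise)
rng = np.random.default_rng(0)
env = np.abs(rng.normal(size=(64, 64)))
emap = entropy_map(env)
print(emap.shape)  # (49, 49)
```

With 32 bins the entropy is bounded by log2(32) = 5 bits; low-entropy regions correspond to more homogeneous scattering, which is the kind of tissue contrast such parametric maps aim to expose.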
See 2 more Smart Citations
“…Kim et al. ( 18 ) employed CNN to process multiple parameter images generated from RF signals for benign and malignant breast tumor analysis, including grayscale, entropy, attenuation, and phase images. The highest accuracy and sensitivity were 83.00% and 92.24%, respectively.…”
Section: Discussionmentioning
confidence: 99%
“…It greatly reduces the number of network parameters and the computational complexity during learning. Weight sharing can take various forms, such as sharing convolution kernel weights or the weights of an entire network module [18,19]. A well-designed weight-sharing structure supports network depth, efficiency, and a lightweight architecture.…”
Section: Weight Sharing Network Framework
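As a minimal illustration of the kernel-sharing idea described above (not the cited architecture), the sketch below applies one shared 3x3 kernel to two input branches, so both branches together cost 9 convolutional weights instead of 18; `conv2d_valid` and all sizes are hypothetical:

```python
import numpy as np

def conv2d_valid(x, k):
    """Plain 'valid'-mode 2-D cross-correlation with a single kernel."""
    kh, kw = k.shape
    h, w = x.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(1)
shared_kernel = rng.normal(size=(3, 3))  # ONE parameter set ...

# ... applied to two parametric-image branches (e.g. entropy, attenuation)
branch_a = rng.normal(size=(8, 8))
branch_b = rng.normal(size=(8, 8))
feat_a = conv2d_valid(branch_a, shared_kernel)
feat_b = conv2d_valid(branch_b, shared_kernel)

# Sharing the kernel halves the parameter count versus giving each
# branch its own 3x3 kernel: 9 weights instead of 18.
print(feat_a.shape, feat_b.shape)  # (6, 6) (6, 6)
```

The same principle scales up to sharing entire modules: every branch reuses the one weight set, so parameter count and memory stay flat as branches are added.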
“…To address the limitations of using B-mode images as AI input, several recent studies have employed radiofrequency (RF) data for convolutional neural network (CNN) based classification of malignant and benign breast masses. Kim et al. (2022) generated parametric maps from RF data as the deep-learning input, reporting higher performance than with B-mode input. Byra et al. (2022) used RF data as the deep-learning input and achieved a higher area under the curve (AUC) than B-mode for breast lesion classification and segmentation.…”
Section: Introduction