2021
DOI: 10.1007/s00330-021-08271-4
Predicting the molecular subtype of breast cancer and identifying interpretable imaging features using machine learning algorithms

Cited by 56 publications (36 citation statements)
References 38 publications
“…In addition, to understand how the models yielded their predictions, SHAP values have been proposed as the most effective method for a visual explanation of the model and for presenting properties of local accuracy and consistency [46, 47]. Indeed, some studies have used SHAP values to select important features for predicting BC molecular subtypes from images [48].…”
Section: Discussion
confidence: 99%
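The excerpt above refers to using SHAP values to explain subtype classifiers and rank imaging features. A minimal sketch of that general workflow is shown below; it is not code from the cited studies, and the feature names, the binary target, and the synthetic data are illustrative assumptions only (it assumes the `shap`, `scikit-learn`, and `numpy` packages are available).

```python
# Hypothetical sketch: ranking imaging features for a subtype classifier
# by mean absolute SHAP value. Data and feature names are made up.
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
feature_names = ["tumor_size_mm", "margin_score", "echo_pattern",
                 "calcification", "posterior_features"]        # assumed names
X = rng.normal(size=(200, len(feature_names)))                 # stand-in imaging features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)                  # stand-in binary subtype label

model = GradientBoostingClassifier().fit(X, y)

# TreeExplainer yields per-sample, per-feature contributions, which is what
# gives SHAP its local accuracy and consistency properties.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)                         # shape: (n_samples, n_features)

# A simple global importance score: mean |SHAP| per feature.
importance = np.abs(shap_values).mean(axis=0)
for name, imp in sorted(zip(feature_names, importance), key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")
```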
“…Recently, Ma et al. used ML to differentiate between breast cancer molecular subtypes based on mammography and ultrasound. Both clinical data and imaging signs based on the BI-RADS lexicon served as inputs for the ML models [37]. In another breast cancer study, perfusion MRI radiomics were used to infer tumor infiltrating lymphocytes [38].…”
Section: Imaging Acquisition Optimization
confidence: 99%
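The excerpt describes feeding a mix of clinical data and BI-RADS-lexicon imaging descriptors into ML models. The sketch below illustrates one common way to do that (one-hot encoding categorical descriptors alongside numeric clinical variables); it is not Ma et al.'s pipeline, and all column names, categories, and rows are hypothetical.

```python
# Hypothetical sketch: combining numeric clinical data with categorical
# BI-RADS descriptors as inputs to an ML classifier. Toy data only.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

df = pd.DataFrame({
    "age": [45, 62, 38, 57],                                          # clinical (numeric)
    "mass_shape": ["oval", "irregular", "round", "irregular"],        # BI-RADS descriptor
    "mass_margin": ["circumscribed", "spiculated", "circumscribed", "indistinct"],
    "calcification": ["none", "fine_pleomorphic", "none", "amorphous"],
    "subtype": ["luminal_A", "triple_negative", "luminal_A", "HER2"],  # label
})

categorical = ["mass_shape", "mass_margin", "calcification"]
numeric = ["age"]

pipeline = Pipeline([
    ("encode", ColumnTransformer(
        [("onehot", OneHotEncoder(handle_unknown="ignore"), categorical)],
        remainder="passthrough",          # numeric clinical columns pass through unchanged
    )),
    ("clf", RandomForestClassifier(n_estimators=100, random_state=0)),
])

pipeline.fit(df[categorical + numeric], df["subtype"])
print(pipeline.predict(df[categorical + numeric]))
```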
“…Each subtype has different clinical characteristics, prognosis, and response to treatment. Different imaging features of each subtype are also reported in the literature [3, 4, 5].…”
Section: Introduction
confidence: 99%