2022
DOI: 10.3390/diagnostics12061457
Deep Transfer Learning for the Multilabel Classification of Chest X-ray Images

Abstract: Chest X-ray (CXR) is widely used to diagnose conditions affecting the chest, its contents, and its nearby structures. In this study, we used a private data set containing 1630 CXR images with disease labels; most of the images were disease-free, but the others contained multiple sites of abnormalities. Here, we used deep convolutional neural network (CNN) models to extract feature representations and to identify possible diseases in these images. We also used transfer learning combined with large open-source i…
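As an illustration of the kind of pipeline the abstract describes, the sketch below fine-tunes an ImageNet-pretrained CNN for multilabel CXR classification with sigmoid outputs and a binary cross-entropy loss. The backbone choice (DenseNet-121), the number of labels, and the data-loading details are assumptions for illustration, not the authors' exact configuration.

```python
# Minimal sketch of deep transfer learning for multilabel CXR classification.
# Assumptions: DenseNet-121 backbone, 8 disease labels, and a PyTorch DataLoader
# named `train_loader` yielding (images, multi_hot_labels); none of these details
# are taken from the paper itself.
import torch
import torch.nn as nn
from torchvision import models

NUM_LABELS = 8  # assumed number of abnormality labels

# Start from an ImageNet-pretrained backbone and replace the classifier head.
model = models.densenet121(weights=models.DenseNet121_Weights.IMAGENET1K_V1)
model.classifier = nn.Linear(model.classifier.in_features, NUM_LABELS)

criterion = nn.BCEWithLogitsLoss()          # one sigmoid/BCE term per label
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_one_epoch(model, train_loader, device="cpu"):
    model.to(device).train()
    for images, labels in train_loader:      # labels: float multi-hot vectors
        images, labels = images.to(device), labels.to(device).float()
        optimizer.zero_grad()
        logits = model(images)                # shape: (batch, NUM_LABELS)
        loss = criterion(logits, labels)
        loss.backward()
        optimizer.step()

# At inference, per-label probabilities come from a sigmoid over the logits:
# probs = torch.sigmoid(model(images)); preds = (probs > 0.5)
```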

Cited by 15 publications (9 citation statements)
References 32 publications
“…Huang GH et al. discovered that using transfer learning in chest X-rays can enhance prediction capabilities and reduce computing costs (27).…”
Section: Introduction (mentioning)
confidence: 99%
“…33 In these situations, deep learning models pretrained on nonmedical image datasets, or on medical image datasets from either a different imaging modality or the same imaging modality but for different clinical tasks, are fine-tuned with a relatively small medical imaging dataset for clinical decision-making tasks. [33][34][35][36][37][38][39] For example, Antropova et al 34 applied transfer learning to three different imaging modalities to extract deep features and fused them with human-engineered radiomic features for the diagnostic classification of breast tumors, with results demonstrating statistically significant improvements in classification performance compared with previously developed computer-aided diagnosis methods. Huang et al 35 applied deep transfer learning to identify possible diseases on CXR images in a multilabel classification task with improved prediction capacities. Samala et al 36 performed multi-stage transfer learning for the classification of malignant and benign masses in digital breast tomosynthesis images and reported improved classification performance.…”
Section: Introduction (mentioning)
confidence: 99%
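The fusion strategy attributed to Antropova et al. above can be sketched as follows: deep features are taken from the penultimate layer of a frozen pretrained CNN and concatenated with hand-engineered radiomic features before a conventional classifier. The backbone, feature dimensions, and the logistic-regression classifier are illustrative assumptions, not the cited authors' exact setup.

```python
# Illustrative sketch of fusing pretrained-CNN features with radiomic features.
# Assumed: a ResNet-50 backbone, a precomputed `radiomic_features` array, and
# binary labels `y`; none of these details come from the cited papers.
import numpy as np
import torch
import torch.nn as nn
from torchvision import models
from sklearn.linear_model import LogisticRegression

# Frozen pretrained backbone used purely as a feature extractor.
backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
backbone.fc = nn.Identity()          # expose the 2048-d penultimate features
backbone.eval()

@torch.no_grad()
def deep_features(images: torch.Tensor) -> np.ndarray:
    """images: (N, 3, 224, 224) tensor -> (N, 2048) feature array."""
    return backbone(images).cpu().numpy()

def fused_classifier(images, radiomic_features, y):
    """Concatenate deep and radiomic features, then fit a simple classifier."""
    X = np.concatenate([deep_features(images), radiomic_features], axis=1)
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X, y)
    return clf
```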
“…In another study, Huang et al. [23] evaluated the gains achieved through transfer learning in a multi-label CXR classification task. They used a private CXR collection containing multiple abnormalities, including aortic sclerosis/calcification, arterial curvature, consolidations, pulmonary fibrosis, enlarged hilar shadows, scoliosis, cardiomegaly, and intercostal pleural thickening.…”
(mentioning)
confidence: 99%
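For a multi-label task over abnormality categories like those listed above, performance is typically reported per label; one common choice is a per-label ROC AUC, as in the sketch below. The label list and the input arrays are placeholders mirroring the citing text, not the study's actual label set or data.

```python
# Sketch of per-label evaluation for a multi-label CXR classifier.
# `y_true` (multi-hot ground truth) and `y_prob` (sigmoid outputs) are
# placeholder arrays; the label list merely mirrors the abnormalities
# mentioned in the citing text and is not the study's exact label set.
import numpy as np
from sklearn.metrics import roc_auc_score

LABELS = [
    "aortic sclerosis/calcification", "arterial curvature", "consolidation",
    "pulmonary fibrosis", "enlarged hilar shadow", "scoliosis",
    "cardiomegaly", "intercostal pleural thickening",
]

def per_label_auc(y_true: np.ndarray, y_prob: np.ndarray) -> dict:
    """y_true, y_prob: arrays of shape (n_samples, n_labels)."""
    scores = {}
    for i, name in enumerate(LABELS):
        if len(np.unique(y_true[:, i])) < 2:   # AUC undefined for a single class
            scores[name] = float("nan")
            continue
        scores[name] = roc_auc_score(y_true[:, i], y_prob[:, i])
    return scores
```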