2019
DOI: 10.1016/j.eswa.2019.01.048

An interpretable deep hierarchical semantic convolutional neural network for lung nodule malignancy classification

Abstract: While deep learning methods are increasingly being applied to tasks such as computer-aided diagnosis, these models are difficult to interpret, do not incorporate prior domain knowledge, and are often considered as a "black-box." The lack of model interpretability hinders them from being fully understood by target users such as radiologists. In this paper, we present a novel interpretable deep hierarchical semantic convolutional neural network (HSCNN) to predict whether a given pulmonary nodule observed on a co…
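
The title and abstract describe a hierarchical network in which low-level semantic attribute predictions support a high-level malignancy prediction. Below is a minimal, hypothetical PyTorch sketch of one way such a hierarchy could be wired; the backbone size, the attribute names, and the fusion of semantic logits into the malignancy head are illustrative assumptions, not the authors' exact HSCNN.

```python
# Minimal sketch (not the published HSCNN): a shared 3D CNN backbone, several
# low-level "semantic attribute" heads, and a high-level malignancy head that
# also sees the semantic predictions. Sizes and attribute names are assumptions.
import torch
import torch.nn as nn

class HierarchicalSemanticCNN(nn.Module):
    def __init__(self, semantic_tasks=("calcification", "margin", "texture")):
        super().__init__()
        # Shared feature extractor over a 3D nodule patch (1 x 32 x 32 x 32 assumed).
        self.backbone = nn.Sequential(
            nn.Conv3d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.Conv3d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),
        )
        # One small binary head per semantic attribute (low-level tasks).
        self.semantic_heads = nn.ModuleDict(
            {name: nn.Linear(32, 1) for name in semantic_tasks}
        )
        # High-level malignancy head: backbone features plus semantic logits.
        self.malignancy_head = nn.Sequential(
            nn.Linear(32 + len(semantic_tasks), 16), nn.ReLU(), nn.Linear(16, 1)
        )

    def forward(self, x):
        feats = self.backbone(x)
        semantic_logits = {k: head(feats) for k, head in self.semantic_heads.items()}
        fused = torch.cat([feats] + list(semantic_logits.values()), dim=1)
        return semantic_logits, self.malignancy_head(fused)

# Usage: joint loss over semantic tasks and malignancy (weights are illustrative).
model = HierarchicalSemanticCNN()
patch = torch.randn(4, 1, 32, 32, 32)          # batch of 3D nodule patches
sem_labels = {k: torch.randint(0, 2, (4, 1)).float() for k in model.semantic_heads}
mal_labels = torch.randint(0, 2, (4, 1)).float()
sem_logits, mal_logits = model(patch)
bce = nn.BCEWithLogitsLoss()
loss = bce(mal_logits, mal_labels) + 0.5 * sum(
    bce(sem_logits[k], sem_labels[k]) for k in sem_logits
)
loss.backward()
```

Training the semantic heads jointly with the malignancy head is what gives this style of model its interpretability: the intermediate attribute predictions are human-readable quantities rather than opaque features.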

Cited by 218 publications (151 citation statements)
References 39 publications
“…More specifically, predictive models for EGFR and KRAS mutation status in lung cancer were developed. Following the current direction in the literature, where the analysis only focuses on the nodule structure and texture [36,37], we started by using objective radiomic features directly extracted from nodules in CT scans. Then, semantic features, annotated during radiologist evaluation, were used as input.…”
mentioning
confidence: 99%
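
The workflow described in this excerpt, comparing objective radiomic features with radiologist-annotated semantic features as inputs to a mutation-status model, can be illustrated with a small, hypothetical scikit-learn sketch; the feature dimensions, classifier choice, and labels below are placeholders, not the cited study's data or methods.

```python
# Sketch of the comparison described above (assumed details): evaluate one
# model on radiomic features and one on radiologist-annotated semantic
# features for EGFR/KRAS mutation-status prediction.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 100
radiomic = rng.normal(size=(n, 30))          # e.g. intensity/texture/shape features
semantic = rng.integers(1, 6, size=(n, 8))   # e.g. ordinal radiologist ratings
mutation = rng.integers(0, 2, size=n)        # toy mutation-status labels

for name, X in [("radiomic", radiomic), ("semantic", semantic)]:
    auc = cross_val_score(LogisticRegression(max_iter=1000), X, mutation,
                          cv=5, scoring="roc_auc").mean()
    print(f"{name} features: cross-validated AUC = {auc:.2f}")
```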
“…Different researchers have studied different factors relevant to wound healing, e.g., the effect of wet versus dry environments on different types of skin wounds, the role of pH in wound healing, and the role of moisture balance in continuous healing. In the recent past, a few wound care solutions focusing on the effect of external factors on wound healing, or sensor-based wound care solutions [11][12][13][14][15][16][17][18][19], have been presented; however, these methods consider only the effect of a single factor at a time and do not provide a learning-based investigation of wound care, as shown in Table 1.…”
Section: Related Work
mentioning
confidence: 99%
“…Therefore, many attempts have been made to develop computer-aided diagnosis (CAD) systems for automatic discrimination [3-20]. Conventional CAD systems first use classical image processing techniques, such as morphologic operators [3-5], region growing [6], energy optimization [7,8], and statistical learning [9,10], to segment a region of interest (ROI) that includes the nodule. Then, handcrafted features are extracted from the ROI, which are then fed to a classifier for nodule classification.…”
Section: Introduction
mentioning
confidence: 99%
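
The excerpt above outlines the conventional CAD pipeline: segment an ROI with classical image processing, extract handcrafted features, and pass them to a classifier. The following is a minimal, hypothetical sketch of that pipeline; the Otsu thresholding, the specific features, and the random forest are illustrative stand-ins for the cited techniques, not any particular system's implementation.

```python
# Minimal sketch of a conventional CAD pipeline as described above:
# (1) segment an ROI with classical image processing, (2) extract handcrafted
# features, (3) feed them to a classifier. The thresholding step and the
# feature set here are illustrative assumptions.
import numpy as np
from skimage import filters, measure
from sklearn.ensemble import RandomForestClassifier

def segment_roi(slice_2d):
    """Thresholding + largest connected component as a stand-in for ROI segmentation."""
    mask = slice_2d > filters.threshold_otsu(slice_2d)
    labels = measure.label(mask)
    if labels.max() == 0:
        return mask  # nothing segmented; return the empty mask
    largest = max(measure.regionprops(labels), key=lambda r: r.area)
    return labels == largest.label

def handcrafted_features(slice_2d, mask):
    """A few simple intensity and shape descriptors of the segmented region."""
    region = slice_2d[mask]
    props = measure.regionprops(mask.astype(int))[0]
    return np.array([
        region.mean(), region.std(),       # intensity statistics
        props.area, props.eccentricity,    # shape statistics
        props.solidity,
    ])

# Toy usage with random "CT slices"; a real system would use nodule patches.
rng = np.random.default_rng(0)
X = []
for _ in range(20):
    sl = rng.normal(size=(64, 64))
    X.append(handcrafted_features(sl, segment_roi(sl)))
y = rng.integers(0, 2, size=20)            # toy benign / malignant labels
clf = RandomForestClassifier(n_estimators=50).fit(np.array(X), y)
```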