Background
Convolutional neural networks (CNN) have achieved remarkable success in medical image analysis. However, unlike some general-domain tasks where model accuracy is paramount, medical applications demand both accuracy and explainability because of the high stakes for patients' lives. Based on model explanations, clinicians can evaluate the diagnostic decisions suggested by a CNN. Nevertheless, prior explainable artificial intelligence methods treat medical image tasks like general vision tasks, following end-to-end paradigms to generate explanations and frequently overlooking crucial clinical domain knowledge.

Methods
We propose a plug-and-play module that explicitly integrates anatomic boundary information into the explanation process for CNN-based thoracopathy classifiers. To generate the anatomic boundary of the lung parenchyma, we utilize a lung segmentation model developed on external public datasets and deploy it on the unseen target dataset to constrain model explanations within the lung parenchyma for the clinical task of thoracopathy classification.

Results
Assessed by the intersection over union (IoU) and Dice similarity coefficient between model-extracted explanations and expert-annotated lesion areas, our method consistently outperformed the baseline devoid of clinical domain knowledge in 71 out of 72 scenarios, encompassing 3 CNN architectures (VGG-11, ResNet-18, and AlexNet), 2 classification settings (binary and multi-label), 3 explanation methods (Saliency Map, Grad-CAM, and Integrated Gradients), and 4 co-occurring thoracic diseases (Atelectasis, Fracture, Mass, and Pneumothorax).

Conclusions
We underscore the effectiveness of leveraging radiology knowledge to improve model explanations for CNNs and envisage that it could inspire future efforts to integrate clinical domain knowledge into medical image analysis.
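The core idea described in the Methods and Results sections can be sketched in a few lines: restrict an attribution heatmap to the lung-parenchyma mask produced by the segmentation model, then score the result against an expert lesion annotation with IoU and Dice. This is a minimal illustrative sketch, not the paper's implementation; the function names, the binarization threshold, and the simple element-wise masking are assumptions.

```python
import numpy as np

def constrain_explanation(heatmap, lung_mask):
    """Zero out attribution outside the lung parenchyma.

    heatmap:   2-D float array of attribution scores (e.g. from Grad-CAM).
    lung_mask: 2-D binary array from the lung segmentation model.
    """
    return heatmap * lung_mask

def binarize(heatmap, threshold=0.5):
    """Turn a continuous heatmap into a binary explanation region.
    The 0.5 threshold is an illustrative choice, not the paper's."""
    return (heatmap >= threshold).astype(np.uint8)

def iou(pred, target):
    """Intersection over union between two binary masks."""
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return inter / union if union else 0.0

def dice(pred, target):
    """Dice similarity coefficient between two binary masks."""
    inter = np.logical_and(pred, target).sum()
    total = pred.sum() + target.sum()
    return 2 * inter / total if total else 0.0
```

In use, one would binarize the masked heatmap and compare it to the expert-annotated lesion mask, e.g. `iou(binarize(constrain_explanation(h, lung)), lesion)`; the baseline in the paper is the same pipeline without the lung-mask constraint.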