The aim of this study is to propose an alternative, hybrid solution method for diagnosing disease from histopathology images taken from animals with paratuberculosis and from those with intact intestines. Specifically, the hybrid method combines image processing and deep learning to improve results. Reliable disease detection from histopathology images is a known open problem in medical image processing, and alternative solutions need to be developed. In this context, 520 histopathology images were collected in a joint study with
Artificial intelligence holds great promise in medical imaging, especially histopathological imaging. However, artificial intelligence algorithms cannot fully explain their decision-making processes. This situation has brought the problem of explainability, i.e., the black-box problem, of artificial intelligence applications to the agenda: an algorithm simply returns a prediction for a given image without stating its reasons. To overcome this problem and improve explainability, explainable artificial intelligence (XAI) has come to the fore and piqued the interest of many researchers. Against this backdrop, this study examines a new and original dataset using a deep learning algorithm and visualizes the output with gradient-weighted class activation mapping (Grad-CAM), one of the XAI techniques. Afterwards, a detailed questionnaire survey was conducted with pathologists on these images. Both the decision-making processes and the explanations were verified, and the accuracy of the output was tested. The research results greatly help pathologists in the diagnosis of paratuberculosis.
Artificial intelligence and its sub-branches, machine learning and deep learning, have proven themselves in many different areas such as medical imaging systems, face recognition, and autonomous driving. Deep learning models in particular have become very popular today. Because deep learning models are very complex in nature, they are among the clearest examples of black-box models. This situation leaves the end user in doubt about interpretability and explainability. Therefore, methods that make such systems understandable through explainable artificial intelligence (XAI) have been widely developed in recent years. In this context, a hybrid method was developed in this study, and a classification study was carried out on a new and original dataset using different deep learning algorithms. Grad-CAM was applied to a VGG16 architecture that achieved a classification accuracy of 99.643%, and heat maps were obtained for images pre-processed with the CLAHE method.