Over 200 million malaria cases globally lead to half a million deaths annually, and accurate malaria diagnosis remains a challenge. Automated image-processing approaches to analyze Thick Blood Films (TBF) could provide scalable solutions for urban healthcare providers in the holoendemic malaria regions of sub-Saharan Africa. Although several approaches have attempted to identify malaria parasites in TBF, none has achieved the negative and positive predictive performance required for clinical use in the west sub-Saharan region. Malaria parasite object detection remains an intermediary step toward automatic patient diagnosis, yet training state-of-the-art deep-learning object detectors requires the labor-intensive expert process of labeling a large dataset of digitized TBF. To overcome these challenges and achieve a clinically usable system, we present a novel approach that leverages routine clinical-microscopy labels from our quality-controlled malaria clinics to train a Deep Malaria Convolutional Neural Network classifier (DeepMCNN) for automated malaria diagnosis. Our system also provides total Malaria Parasite (MP) and White Blood Cell (WBC) counts, allowing parasitemia estimation in MP/μL as recommended by the WHO. In prospective validation, the DeepMCNN achieves sensitivity/specificity of 0.92/0.90 against expert-level malaria diagnosis, with PPV/NPV of 0.92/0.90, which is clinically usable in our holoendemic setting in the densely populated metropolis of Ibadan, located in the most populous African country (Nigeria) and carrying one of the largest burdens of Plasmodium falciparum malaria. Our openly available method is important for strategies that aim to scale malaria diagnosis in urban regions where thousands of specimens must be assessed daily.
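The parasitemia estimate mentioned above follows directly from the reported MP and WBC counts. Below is a minimal sketch of the standard WHO convention of scaling the parasite-to-WBC ratio by an assumed WBC density (commonly 8000 WBC/μL); the function name and example counts are illustrative and not taken from the paper.

```python
def estimate_parasitemia(mp_count: int, wbc_count: int,
                         assumed_wbc_per_ul: float = 8000.0) -> float:
    """Estimate parasite density in MP/uL from thick-blood-film counts.

    Uses the common WHO convention: (parasites counted / WBCs counted)
    multiplied by an assumed WBC concentration (default 8000 WBC/uL).
    """
    if wbc_count <= 0:
        raise ValueError("WBC count must be positive")
    return mp_count * assumed_wbc_per_ul / wbc_count


# Illustrative example: 150 parasites counted against 500 WBCs -> 2400 MP/uL
print(estimate_parasitemia(150, 500))
```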
While optical microscopy inspection of blood films and bone marrow aspirates by a hematologist is a crucial step in establishing the diagnosis of acute leukemia, especially in low-resource settings where other diagnostic modalities are not available, the task remains time-consuming and prone to human inconsistency. This particularly affects cases of Acute Promyelocytic Leukemia (APL), which require urgent treatment. Integrating automated computational hematopathology into clinical workflows can improve the throughput of these services and reduce cognitive human error. However, a major bottleneck in deploying such systems is the lack of sufficient cell-level morphological annotations to train deep learning models. We overcome this by leveraging patient diagnostic labels to train weakly supervised models that detect different types of acute leukemia. We introduce a deep learning approach, Multiple Instance Learning for Leukocyte Identification (MILLIE), able to perform reliable automated analysis of blood films with minimal supervision. Without being trained to classify individual cells, MILLIE differentiates between acute lymphoblastic and myeloblastic leukemia in blood films. More importantly, MILLIE detects APL in blood films (AUC 0.94 ± 0.04) and in bone marrow aspirates (AUC 0.99 ± 0.01). MILLIE is a viable solution to augment the throughput of clinical pathways that require assessment of blood film microscopy.
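To make the weakly supervised setup concrete, the sketch below shows a generic multiple-instance-learning formulation in PyTorch: each blood film is treated as a bag of single-cell crops, only the patient-level diagnostic label supervises training, and the bag prediction is a max over per-cell scores. The encoder, pooling choice, and class count are illustrative assumptions, not the published MILLIE architecture.

```python
import torch
import torch.nn as nn


class MILBagClassifier(nn.Module):
    """Weakly supervised bag classifier (illustrative, not MILLIE itself).

    A blood film is a "bag" of single-cell crops; training uses only the
    patient-level diagnostic label, never per-cell annotations.
    """

    def __init__(self, n_classes: int = 3):
        super().__init__()
        # Small CNN encoder mapping each cell crop to a 32-d feature vector.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.instance_head = nn.Linear(32, n_classes)

    def forward(self, bag: torch.Tensor) -> torch.Tensor:
        # bag: (n_cells, 3, H, W) -> per-cell class scores.
        instance_logits = self.instance_head(self.encoder(bag))
        # Max-pool over cells: a bag is positive for a class if at least one
        # cell strongly supports it (the standard MIL assumption).
        bag_logits, _ = instance_logits.max(dim=0)
        return bag_logits


# Example: one film represented by 64 cell crops of 32x32 pixels, 3 classes.
model = MILBagClassifier(n_classes=3)
logits = model(torch.randn(64, 3, 32, 32))  # shape: (3,)
```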