The infection of domestic chickens with an avian malaria parasite (Plasmodium gallinaceum) presents a major threat to the poultry industry because it causes economic losses in both the quality and quantity of meat and egg production. Computer-aided diagnosis has been developed to automatically identify avian malaria infections and classify the developmental stage of blood infection. In this study, four deep convolutional neural networks, namely Darknet, Darknet19, Darknet19-448, and Densenet201, were used to classify P. gallinaceum blood stages. We randomly collected a dataset of 12,761 single-cell images covering three parasite stages from ten Giemsa-stained infected blood films. All images were confirmed by three well-trained examiners. The study compared several image classification models and used both qualitative and quantitative data to evaluate the proposed models. In the model-wise comparison, all four neural network models achieved a mean average accuracy of at least 97%. Darknet delivered superior performance in classifying the P. gallinaceum developmental stages compared with the other model architectures. Furthermore, Darknet had the best class-wise performance, with average accuracy, specificity, and sensitivity all greater than 99%, and a lower misclassification rate (< 1%) than the other three models. It is therefore the most suitable model for classifying P. gallinaceum blood stages. The findings could support a fast screening method to help non-experts in field studies where specialized instruments for avian malaria diagnostics are lacking.
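For illustration, the sketch below shows how a pretrained convolutional network can be adapted into a three-class blood-stage classifier. It is not the authors' pipeline: the Darknet variants are not bundled with torchvision, so DenseNet201 (one of the four compared models) stands in, and the stage labels and image path are assumptions.

```python
# Minimal sketch of single-cell stage classification with a pretrained CNN.
# Assumptions: DenseNet201 stands in for the Darknet variants; the stage
# label names are hypothetical; torchvision >= 0.13 weights API is used.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

STAGES = ["trophozoite", "schizont", "gametocyte"]  # assumed stage labels

# Load an ImageNet-pretrained DenseNet201 and swap in a 3-class head.
model = models.densenet201(weights=models.DenseNet201_Weights.IMAGENET1K_V1)
model.classifier = nn.Linear(model.classifier.in_features, len(STAGES))
model.eval()

# Standard ImageNet preprocessing for a single-cell crop.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def classify_cell(path: str) -> str:
    """Return the predicted blood-stage label for one cell image."""
    x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        logits = model(x)
    return STAGES[int(logits.argmax(dim=1))]
```

In practice the 3-class head would first be fine-tuned on the labeled single-cell dataset before inference; the snippet shows only the model surgery and prediction path.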
Mosquito-borne diseases such as dengue fever and malaria are among the top ten leading causes of death in low-income countries. Controlling the mosquito population plays an essential role in the fight against these diseases. Several intervention strategies (chemical, biological, mechanical, and environmental methods) remain under development and need further improvement in effectiveness. Although conventional entomological surveillance, which requires a microscope and taxonomic keys for identification by professionals, is a key strategy for evaluating mosquito population growth, it is tedious, time-consuming, labor-intensive, and reliant on skillful, well-trained personnel. Here, we propose an automated screening approach based on deep metric learning, with inference performed as an image-retrieval process using Euclidean distance-based similarity. We aimed to optimize the model by identifying a suitable data miner and assessed the robustness of the proposed model by evaluating it on unseen data under a 20-returned-image system. During model development, the well-trained ResNet34 was outstanding, with no performance difference among the five data miners compared, reaching up to 98% precision even when the model was tested with both image sources: stereomicroscope and mobile phone cameras. The robustness of the trained model was further tested with secondary unseen data that varied in environmental factors such as lighting, image scale, background color, and zoom level. Even so, the proposed neural network maintained sensitivity and precision greater than 95%, and the area under the ROC curve exceeded 0.960, suggesting the learning system is practical. The results of the study may be used by public health authorities to locate nearby mosquito vectors, and the tool is expected to reflect real-world field conditions accurately.
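The retrieval step described above can be sketched as follows: embed images with a ResNet34 backbone whose classifier head has been removed, then rank a gallery by Euclidean distance and return the 20 nearest images. This is a minimal sketch under assumptions: the trained metric-learning weights and image preprocessing are omitted, and stock ImageNet weights stand in for the fine-tuned embedder.

```python
# Minimal sketch of Euclidean distance-based image retrieval with a
# ResNet34 embedder. Assumption: inputs are already preprocessed
# (N, 3, 224, 224) tensors; metric-learning fine-tuning is not shown.
import torch
import torch.nn as nn
from torchvision import models

embedder = models.resnet34(weights=models.ResNet34_Weights.IMAGENET1K_V1)
embedder.fc = nn.Identity()  # expose the 512-d penultimate features
embedder.eval()

@torch.no_grad()
def embed(batch: torch.Tensor) -> torch.Tensor:
    """Map preprocessed images (N, 3, 224, 224) to 512-d embeddings."""
    return embedder(batch)

@torch.no_grad()
def retrieve_top_k(query: torch.Tensor, gallery: torch.Tensor, k: int = 20):
    """Return indices of the k gallery embeddings closest to the query.

    query: (512,) embedding; gallery: (N, 512) embeddings.
    """
    dists = torch.cdist(query.unsqueeze(0), gallery).squeeze(0)  # Euclidean
    return torch.topk(dists, k, largest=False).indices
```

With a metric-trained embedder, the class of a query specimen can then be inferred by, for example, majority vote over the labels of the 20 returned images.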
Background: Object detection is a new artificial intelligence approach to the morphological recognition and labeling of parasitic pathogens. Given the lack of equipment and trained personnel, artificial intelligence innovations for detecting various parasitic products in stool examinations could give patients in remote areas of less developed countries access to diagnostic services. Because object detection is a developing approach whose effectiveness in detecting intestinal parasitic objects such as protozoan cysts and helminthic eggs has been tested, it is suitable for use in rural areas where many of the factors supporting laboratory testing are still lacking. According to the literature, YOLOv4-Tiny produces faster results and uses less memory on low-end GPU devices. This study aimed to propose an automated object detection approach, specifically the YOLOv4-Tiny model, for the automatic recognition of intestinal parasitic products in stools, compared against the YOLOv3 and YOLOv3-Tiny models.

Methods: To identify protozoan cysts and helminthic eggs in human feces, the three YOLO models (YOLOv4-Tiny, YOLOv3, and YOLOv3-Tiny) were trained to recognize 34 intestinal parasite classes on an image dataset. Feces were processed using a modified direct smear method adapted from the simple direct smear and modified Kato-Katz methods. The image dataset was collected from intestinal parasitic objects discovered during stool examination.

Results: The test dataset was analyzed using the non-maximum suppression technique and a confidence threshold, yielding 96.25% precision and 95.08% sensitivity for YOLOv4-Tiny. Additionally, the YOLOv4-Tiny model had the best AUPRC of the three YOLO models, with a score of 0.963.

Conclusion: To our knowledge, this study was the first to detect protozoan cysts and helminthic eggs across 34 classes of intestinal parasitic objects in human stools.
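The post-processing named in the Results (confidence thresholding followed by non-maximum suppression) can be sketched as below. This is a generic illustration, not the study's implementation: the box format and threshold values are assumptions, chosen only to show the technique.

```python
# Minimal sketch of confidence thresholding + class-wise non-maximum
# suppression on raw detections. Assumptions: boxes are (x1, y1, x2, y2)
# float tensors in pixels; threshold values are illustrative.
import torch
from torchvision.ops import nms

def filter_detections(boxes, scores, labels,
                      conf_thresh=0.25, iou_thresh=0.45):
    """Keep confident, non-overlapping detections, per class."""
    keep_conf = scores >= conf_thresh          # drop low-confidence boxes
    boxes, scores, labels = boxes[keep_conf], scores[keep_conf], labels[keep_conf]
    kept = []
    for cls in labels.unique():                # suppress within each class
        idx = (labels == cls).nonzero(as_tuple=True)[0]
        kept.append(idx[nms(boxes[idx], scores[idx], iou_thresh)])
    kept = torch.cat(kept) if kept else torch.empty(0, dtype=torch.long)
    return boxes[kept], scores[kept], labels[kept]
```

torchvision.ops.batched_nms provides the same class-wise behavior in a single call; the explicit loop is used here only to make the per-class suppression visible.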