Microscopic observation of mosquito specimens, the basis of morphological identification, is a time-consuming and challenging process, particularly because public health personnel differ in skill and experience. We present deep learning models based on the well-known you-only-look-once (YOLO) algorithm. These models simultaneously localize mosquitoes in images and classify them, identifying the species and gender of field-caught mosquitoes. The results indicated that a model concatenating two YOLO v3 networks exhibited the best performance, because the mosquitoes are relatively small objects within large environmental images. Robustness testing of the proposed model yielded a mean average precision of 99% and a sensitivity of 92.4%. The model also exhibited high specificity and accuracy, with an extremely low rate of misclassification. The area under the receiver operating characteristic curve (AUC) was 0.958 ± 0.011, further demonstrating the model's accuracy. Based on the confusion matrix, thirteen classes were detected with an accuracy of 100%. Nevertheless, the relatively low detection rates for two species were likely a result of the limited number of wild-caught biological samples available. The proposed model can help establish the population densities of mosquito vectors in remote areas so that disease outbreaks can be predicted in advance.
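As a rough illustration of how detection metrics such as the reported mean average precision and sensitivity are typically obtained, the following minimal Python sketch matches predicted bounding boxes to ground-truth boxes by intersection-over-union (IoU) at a fixed threshold. The box format, threshold, and example boxes are assumptions for illustration only and are not taken from the study.

```python
import numpy as np

def iou(box_a, box_b):
    """IoU of two boxes given as (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0]); y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2]); y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def precision_recall(pred_boxes, gt_boxes, iou_thr=0.5):
    """Greedy matching of predictions (assumed sorted by confidence) to ground truth."""
    matched_gt = set()
    tp = 0
    for p in pred_boxes:
        best_j, best_iou = -1, 0.0
        for j, g in enumerate(gt_boxes):
            if j in matched_gt:
                continue
            v = iou(p, g)
            if v > best_iou:
                best_j, best_iou = j, v
        if best_iou >= iou_thr:
            tp += 1
            matched_gt.add(best_j)
    fp = len(pred_boxes) - tp
    fn = len(gt_boxes) - tp
    precision = tp / (tp + fp + 1e-9)
    recall = tp / (tp + fn + 1e-9)  # recall corresponds to the reported sensitivity
    return precision, recall

# Purely illustrative boxes
preds = [(10, 10, 50, 50), (60, 60, 90, 90)]
gts = [(12, 11, 52, 49)]
print(precision_recall(preds, gts))
```

Averaging such precision values over confidence thresholds and classes is what yields the mean average precision figure quoted above.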
DNA double-strand breaks (DSBs) are the most lethal form of damage to cells from irradiation. γ-H2AX (the phosphorylated form of the H2AX histone variant) has become one of the most reliable and sensitive biomarkers of DNA DSBs. However, the γ-H2AX foci assay is still limited by the time required for manual scoring and by possible variability between scorers. This study proposed a novel automated foci scoring method using a deep convolutional neural network based on a You-Only-Look-Once (YOLO) algorithm to quantify γ-H2AX foci in peripheral blood samples. FociRad, a two-stage deep learning approach, consisted of mononuclear cell (MNC) detection followed by γ-H2AX foci detection. Whole blood samples were irradiated with X-rays from a 6 MV linear accelerator at 1, 2, 4 or 6 Gy, and images were captured using confocal microscopy. Dose–response calibration curves were then established and applied to an unseen dataset. The results of the FociRad model were comparable with manual scoring. MNC detection yielded 96.6% accuracy, 96.7% sensitivity and 96.5% specificity. γ-H2AX foci detection showed very good F1 scores (> 0.9). Applying the calibration curve in the range of 0–4 Gy gave a mean absolute difference between estimated and actual doses of less than 1 Gy. In addition, the evaluation times of FociRad were very short (< 0.5 min per 100 images), whereas the time for manual scoring increased with the number of foci. In conclusion, FociRad was the first automated foci scoring method to use a YOLO algorithm with high detection performance and fast evaluation time, which opens the door for large-scale applications in radiation triage.
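For context, dose–response calibration in foci-based biodosimetry is commonly modelled by fitting foci yield per cell against dose with a linear-quadratic form and then inverting the fit to estimate an unknown dose. The minimal Python sketch below shows this general idea; the functional form, the synthetic yields, and the parameter values are assumptions for illustration and do not come from the FociRad study.

```python
import numpy as np
from scipy.optimize import curve_fit, brentq

# Synthetic calibration data: dose (Gy) vs. mean foci per mononuclear cell (illustrative only).
doses = np.array([0.0, 1.0, 2.0, 4.0])
yields = np.array([0.3, 5.1, 9.8, 18.9])

def linear_quadratic(d, c, alpha, beta):
    """Y(D) = c + alpha*D + beta*D^2, a common dose-response form."""
    return c + alpha * d + beta * d ** 2

params, _ = curve_fit(linear_quadratic, doses, yields, p0=(0.1, 4.0, 0.1))

def estimate_dose(observed_yield, d_max=6.0):
    """Numerically invert the fitted curve to recover a dose estimate."""
    f = lambda d: linear_quadratic(d, *params) - observed_yield
    return brentq(f, 0.0, d_max)

print("fitted (c, alpha, beta):", params)
print("estimated dose for an observed yield of 12.0 foci/cell:",
      round(estimate_dose(12.0), 2), "Gy")
```

In an automated pipeline, the observed yield would come from the foci and cell counts produced by the detection stages rather than from hand-entered values.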
Infection with an avian malaria parasite (Plasmodium gallinaceum) in domestic chickens presents a major threat to the poultry industry because it causes economic losses in both the quality and quantity of meat and egg production. Computer-aided diagnosis has been developed to automatically identify avian malaria infections and classify the developmental stage of blood infection. In this study, four types of deep convolutional neural networks, namely Darknet, Darknet19, Darknet19-448 and Densenet201, were used to classify P. gallinaceum blood stages. We randomly collected a dataset of 12,761 single-cell images covering three parasite stages from ten infected blood films stained with Giemsa. All images were confirmed by three well-trained examiners. The study compared several image classification models using both qualitative and quantitative evaluation. In the model-wise comparison, all four neural networks performed well, with a mean average accuracy of at least 97%. Darknet delivered superior performance in classifying the P. gallinaceum developmental stages compared with the other model architectures. Furthermore, Darknet achieved the best class-wise performance, with average accuracy, specificity, and sensitivity each greater than 99%, and a lower misclassification rate (< 1%) than the other three models. Therefore, this model is the most suitable for classifying P. gallinaceum blood stages. The findings could support a fast screening method for non-experts in field studies that lack specialized instruments for avian malaria diagnostics.
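The class-wise accuracy, sensitivity and specificity quoted above are typically derived from a multi-class confusion matrix in a one-vs-rest fashion. The short Python sketch below shows that calculation on a made-up 3×3 confusion matrix for three parasite stages; the matrix values and the stage labels are purely illustrative assumptions, not results from the study.

```python
import numpy as np

# Illustrative 3x3 confusion matrix (rows = true stage, columns = predicted stage).
cm = np.array([
    [980,  12,   8],
    [ 10, 965,  25],
    [  5,  18, 977],
])

total = cm.sum()
stages = ["stage_1", "stage_2", "stage_3"]  # placeholder names for the three stages
for k, stage in enumerate(stages):
    tp = cm[k, k]                      # correctly predicted as this stage
    fn = cm[k, :].sum() - tp           # this stage predicted as something else
    fp = cm[:, k].sum() - tp           # other stages predicted as this stage
    tn = total - tp - fn - fp          # everything else
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / total
    print(f"{stage}: acc={accuracy:.3f} sens={sensitivity:.3f} spec={specificity:.3f}")
```

Averaging these per-class values across the three stages gives the kind of class-wise summary figures reported for the Darknet model.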