Objective: This study evaluated the use of a deep-learning approach for automated detection and numbering of deciduous teeth in children as depicted on panoramic radiographs. Methods and materials: An artificial intelligence (AI) algorithm (CranioCatch, Eskisehir, Turkey) using the Faster R-CNN Inception v2 (COCO) model was developed to automatically detect and number deciduous teeth as seen on pediatric panoramic radiographs. The algorithm was trained and tested on a total of 421 panoramic images. System performance was assessed using a confusion matrix. Results: The AI system was successful in detecting and numbering the deciduous teeth of children as depicted on panoramic radiographs. The sensitivity and precision rates were high. The estimated sensitivity, precision, and F1 score were 0.9804, 0.9571, and 0.9686, respectively. Conclusion: Deep-learning-based AI models are a promising tool for the automated charting of panoramic dental radiographs from children. In addition to serving as a time-saving measure and an aid to clinicians, AI plays a valuable role in forensic identification.
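As a quick consistency check on the reported figures, the F1 score is the harmonic mean of sensitivity (recall) and precision. A minimal sketch (the helper name `f1_score` is illustrative, not from the study):

```python
def f1_score(sensitivity: float, precision: float) -> float:
    """Harmonic mean of sensitivity (recall) and precision."""
    return 2 * sensitivity * precision / (sensitivity + precision)

# Reported values: sensitivity 0.9804, precision 0.9571
print(round(f1_score(0.9804, 0.9571), 4))  # → 0.9686
```

The result matches the F1 score reported in the abstract.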
Background Panoramic radiography is an imaging method for displaying maxillary and mandibular teeth together with their supporting structures. Panoramic radiography is frequently used in dental imaging due to its relatively low radiation dose, short imaging time, and low burden to the patient. We verified the diagnostic performance of an artificial intelligence (AI) system based on a deep convolutional neural network method to detect and number teeth on panoramic radiographs. Methods The data set included 2482 anonymized panoramic radiographs from adults from the archive of Eskisehir Osmangazi University, Faculty of Dentistry, Department of Oral and Maxillofacial Radiology. A Faster R-CNN Inception v2 model was used to develop an AI algorithm (CranioCatch, Eskisehir, Turkey) to automatically detect and number teeth on panoramic radiographs. Human observation and AI methods were compared on a test data set consisting of 249 panoramic radiographs. True positive, false positive, and false negative rates were calculated for each quadrant of the jaws. The sensitivity, precision, and F-measure values were estimated using a confusion matrix. Results The total numbers of true positive, false positive, and false negative results were 6940, 250, and 320 for all quadrants, respectively. Consequently, the estimated sensitivity, precision, and F-measure were 0.9559, 0.9652, and 0.9606, respectively. Conclusions The deep convolutional neural network system was successful in detecting and numbering teeth. Clinicians can use AI systems to detect and number teeth on panoramic radiographs, which may eventually replace evaluation by human observers and support decision making.
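The counts given in the abstract let the metrics be reproduced directly from the standard confusion-matrix definitions. A hedged sketch (the function name is illustrative, not from the study):

```python
def detection_metrics(tp: int, fp: int, fn: int) -> dict:
    """Sensitivity, precision, and F1 from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)  # recall: share of actual teeth detected
    precision = tp / (tp + fp)    # share of detections that are correct
    f1 = 2 * sensitivity * precision / (sensitivity + precision)
    return {"sensitivity": sensitivity, "precision": precision, "f1": f1}

# Totals over all quadrants reported in the abstract
m = detection_metrics(tp=6940, fp=250, fn=320)
print({k: round(v, 4) for k, v in m.items()})
# → {'sensitivity': 0.9559, 'precision': 0.9652, 'f1': 0.9606}
```

The computed values agree with the 0.9559, 0.9652, and 0.9606 reported in the abstract.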
The purpose of this paper was to assess the performance of an artificial intelligence (AI) algorithm based on a deep convolutional neural network (D-CNN) model for the segmentation of apical lesions on dental panoramic radiographs. A total of 470 anonymized panoramic radiographs were used to develop the D-CNN AI model, based on the U-Net algorithm (CranioCatch, Eskisehir, Turkey), for the segmentation of apical lesions. The radiographs were obtained from the Radiology Archive of the Department of Oral and Maxillofacial Radiology of the Faculty of Dentistry of Eskisehir Osmangazi University. A U-Net model implemented in PyTorch (version 1.4.0) was used for the segmentation of apical lesions. In the test data set, the AI model segmented 63 periapical lesions on 47 panoramic radiographs. The sensitivity, precision, and F1-score for segmentation of periapical lesions at a 70% IoU threshold were 0.92, 0.84, and 0.88, respectively. AI systems have the potential to overcome clinical problems. AI may facilitate the assessment of periapical pathology based on panoramic radiographs.
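The 70% IoU criterion means a predicted lesion segmentation counts as correct only when its intersection-over-union with the ground-truth mask reaches 0.7. A minimal sketch of mask IoU over pixel-coordinate sets (the function name and the toy masks are illustrative, not from the study):

```python
def mask_iou(pred: set, truth: set) -> float:
    """Intersection-over-union of two segmentation masks,
    each given as a set of (row, col) pixel coordinates."""
    if not pred and not truth:
        return 1.0  # two empty masks agree perfectly
    inter = len(pred & truth)
    union = len(pred | truth)
    return inter / union

# Toy 2x2 masks offset by one row: 2 shared pixels out of 6 total
a = {(0, 0), (0, 1), (1, 0), (1, 1)}
b = {(1, 0), (1, 1), (2, 0), (2, 1)}
iou = mask_iou(a, b)
print(round(iou, 3), iou >= 0.7)  # → 0.333 False
```

At this overlap the prediction would not count as a match under the study's 70% threshold.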
Bite-wing radiographs are among the most widely used intraoral radiography techniques in dentistry. AI is extremely important for more efficient patient care in the field of dentistry. The aim of this study was to perform a diagnostic evaluation of bite-wing radiographs with an AI model based on CNNs. In this study, 500 bite-wing radiographs from the radiography archive of Eskişehir Osmangazi University, Faculty of Dentistry, Department of Oral and Maxillofacial Radiology were used. Five diagnostic categories (caries, crowns, pulp, restorative material, and root-filling material) were labeled with the segmentation technique using the CranioCatch labeling program (CranioCatch, Eskisehir, Turkey). The U-Net architecture was used to develop the AI model. The F1 score, sensitivity, and precision were, respectively: caries, 0.8818, 0.8235, and 0.9491; crowns, 0.9629, 0.9285, and 1; pulp, 0.9631, 0.9843, and 0.9429; restorative material, 0.9714, 0.9622, and 0.9807; and root-filling material, 0.9722, 0.9459, and 1. This study has shown that an AI model can be used to automatically evaluate bite-wing radiographs, and the results are promising. Owing to these automatically prepared charts, clinicians working at an intense clinical pace will be able to work more efficiently and quickly.