Objectives: Cephalometric analysis is essential for diagnosis, treatment planning, and outcome assessment in orthodontics and orthognathic surgery. Using artificial intelligence (AI) to automate landmark localization has proved feasible and convenient. However, current systems remain insufficient for clinical application: patients exhibit a wide range of malocclusions, cephalograms are produced by machines from different manufacturers, and only limited cephalograms have been used to train the AI in these systems. Methods: A robust and clinically applicable AI system for automatic cephalometric analysis was proposed. First, 9870 cephalograms of patients with various malocclusions, taken on different radiography machines, were collected from 20 medical institutions. Then 30 landmarks on each cephalogram were manually annotated to train the system, which comprises a two-stage convolutional neural network (CNN) and a software-as-a-service (SaaS) platform. Further, more than 100 orthodontists participated in refining the AI-output landmark localizations and re-training the system. Results: The system's average landmark prediction error was as low as 0.94 ± 0.74 mm, and it achieved an average classification accuracy of 89.33%. Conclusions: A CNN-based automatic cephalometric analysis system was proposed that performs automatic landmark localization and classification of cephalometric measurements. This system showed promise for improving diagnostic efficiency in clinical settings.
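The abstract does not detail the two-stage CNN, but the usual coarse-to-fine scheme is: a first stage proposes an approximate landmark position on a downscaled image, a region of interest (ROI) is cropped around it at full resolution, and a second stage refines the position inside the ROI. A minimal sketch of the heatmap-decoding arithmetic (all names and numbers are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def decode_heatmap(heatmap):
    """Return (row, col) of the strongest response in a landmark heatmap."""
    return np.unravel_index(np.argmax(heatmap), heatmap.shape)

def coarse_to_fine(coarse_heatmap, fine_heatmap, scale, roi_size):
    """Two-stage localization: a coarse heatmap on the downscaled image
    proposes an ROI; a fine heatmap inside that ROI refines the landmark."""
    cy, cx = decode_heatmap(coarse_heatmap)
    # Map the coarse estimate back to full resolution; the ROI is centered on it.
    roi_y = cy * scale - roi_size // 2
    roi_x = cx * scale - roi_size // 2
    fy, fx = decode_heatmap(fine_heatmap)  # refined offset within the ROI
    return roi_y + fy, roi_x + fx          # full-resolution landmark coordinates

# Toy example: coarse peak at (4, 5) on a 1/8-scale map, fine peak at (10, 12).
coarse = np.zeros((8, 8)); coarse[4, 5] = 1.0
fine = np.zeros((32, 32)); fine[10, 12] = 1.0
y, x = coarse_to_fine(coarse, fine, scale=8, roi_size=32)  # → (26, 36)
```

The second stage only ever sees a small crop at native resolution, which is what lets such systems keep sub-millimetre accuracy without processing the full cephalogram at full resolution.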
Objective: Cephalometric analysis has been significantly facilitated by artificial intelligence (AI) in recent years. For digital cephalograms, linear measurements depend on a length-calibration step that has not yet been automated in current AI-based systems. This study therefore aimed to develop an automated calibration system for lateral cephalometry so that linear measurements can be conducted more efficiently. Approach: The system combines deep learning algorithms with a medical prior on a stable structure, the anterior cranial base (sella–nasion). First, a two-stage cascade convolutional neural network was trained on 2860 cephalograms to locate the sella, nasion, and two ruler points within regions of interest (ROIs). The sella–nasion distance was then used to estimate the physical distance between the ruler points, from which the pixel size of each cephalogram was obtained for linear measurements. The accuracy of automated landmark localization, ruler-length prediction, and linear measurement based on automated calibration was evaluated with statistical analysis. Main results: Among AI-located points, 99.6% of S points and 86% of N points deviated less than 2 mm from the ground truth, and 99% of ruler points deviated less than 0.3 mm from the ground truth. The system also correctly predicted the ruler length in 98.95% of samples. Based on automated calibration, 11 linear cephalometric measurements on the test set showed no difference from those based on manual calibration (p > 0.05). Significance: This is the first system reported in the literature to conduct automated calibration with high accuracy, and it shows high potential for clinical application in cephalometric analysis.
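Once the ruler points and the ruler's physical length are known, the calibration arithmetic itself is simple: the pixel size is the ruler's length in millimetres divided by its length in pixels, and any landmark-to-landmark pixel distance then scales by that factor. A minimal sketch with illustrative coordinates (not data from the paper):

```python
import math

def pixel_size_mm(ruler_p1, ruler_p2, ruler_length_mm):
    """Pixel size (mm per pixel) from two detected ruler points whose true
    physical separation (the predicted ruler length) is known."""
    px_dist = math.dist(ruler_p1, ruler_p2)
    return ruler_length_mm / px_dist

def linear_measurement_mm(point_a, point_b, mm_per_px):
    """Convert a pixel-space distance between two landmarks to millimetres."""
    return math.dist(point_a, point_b) * mm_per_px

# Toy numbers: ruler points 200 px apart known to span 10 mm → 0.05 mm/px.
mm_per_px = pixel_size_mm((100, 50), (300, 50), ruler_length_mm=10.0)
# A 1440 px landmark distance then corresponds to 72.0 mm.
sn_mm = linear_measurement_mm((400, 120), (400, 1560), mm_per_px)
```

The deep-learning contribution lies in locating the ruler points reliably and predicting which ruler length applies; the conversion above is the deterministic final step.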
Background Sexual dimorphism is evident not only in the overall architecture of the human body but also in intraoral details. Many studies have found correlations between gender and morphometric features of the teeth, such as mesio-distal diameter, buccal-lingual diameter, and height. Nevertheless, it remains difficult to determine gender by visual inspection of intraoral photographs, with accuracy around 50%. The purpose of this study was to explore the feasibility of automatically determining gender from intraoral photographs with a deep neural network, and to provide a novel angle for individualized oral treatment. Methods A deep learning model based on R-net was proposed, using the largest dataset to date (10,000 intraoral images) to support automatic gender detection. To reverse-analyze the classification basis of the neural network, Gradient-weighted Class Activation Mapping (Grad-CAM) was applied in a second step to explore the anatomical factors associated with gender recognizability. Simulated modification of the images based on the suggested features was then conducted to verify the importance of the characteristics distinguishing the two genders. Precision (specificity), recall (sensitivity), and receiver operating characteristic (ROC) curves were used to evaluate the performance of the network; the chi-square test was used to evaluate intergroup differences, with p < 0.05 considered statistically significant. Results The deep learning model showed a strong ability to learn features from intraoral images compared with human experts, with accuracies of 86.5% and 82.5% on the uncropped and cropped image groups, respectively. Compared with the hard tissue exposed in the mouth, gender differences in areas covered by soft tissue were easier to identify, and were more significant in the mandibular region than in the maxillary region.
In photographs with simulated removal of the lips and basal bone along with the overlapping gingiva, the mandibular anterior teeth had importance for sex determination similar to that of the maxillary anterior teeth. Conclusions The deep learning method could detect gender from intraoral photographs with high efficiency and accuracy. With the assistance of Grad-CAM, the classification basis of the neural network was deciphered, providing a more precise entry point for the individualization of prosthodontic, periodontal, and orthodontic treatments.
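Grad-CAM, used above to locate the regions driving the gender prediction, reduces at its core to one computation: weight each convolutional activation map by the spatial mean of the class-score gradient on that channel, sum the weighted maps, and apply a ReLU. A minimal NumPy sketch of that core step on toy tensors (not the authors' code, which would operate on the R-net's actual feature maps):

```python
import numpy as np

def grad_cam(activations, gradients):
    """Core Grad-CAM step.
    activations, gradients: arrays of shape (channels, H, W) holding a conv
    layer's feature maps and the class score's gradients w.r.t. them."""
    weights = gradients.mean(axis=(1, 2))                  # one weight per channel
    cam = np.tensordot(weights, activations, axes=(0, 0))  # weighted sum over channels
    cam = np.maximum(cam, 0)                               # ReLU: keep positive evidence
    if cam.max() > 0:
        cam /= cam.max()                                   # normalize to [0, 1]
    return cam

# Toy tensors: 3 channels on a 4x4 spatial grid.
rng = np.random.default_rng(0)
acts = rng.random((3, 4, 4))
grads = rng.random((3, 4, 4))
cam = grad_cam(acts, grads)  # 4x4 saliency map in [0, 1]
```

Upsampled to the input resolution and overlaid on the photograph, such a map is what allowed the study to attribute the prediction to soft-tissue versus hard-tissue regions.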