Temporomandibular joint osteoarthritis (TMJ OA) is a degenerative condition of the TMJ driven by a pathological tissue response of the joint to mechanical loading. It is characterized by progressive destruction of the internal surfaces of the joint, which can result in debilitating pain and joint noise. Combined with a thorough clinical examination, panoramic imaging can serve as a basic screening tool in diagnosing TMJ OA. This paper proposes an algorithm that extracts the condylar region and determines its abnormality using convolutional neural networks (CNNs) and a Faster region-based CNN (Faster R-CNN). Panoramic images were collected retrospectively, and 1000 images were classified into three categories (normal, abnormal, and unreadable) by a dentist or orofacial pain specialist. Each image was labeled with whether the condyle was detectable and, if so, with its location, so that training used clearly recognizable panoramic images. The uneven proportion of normal to abnormal data was balanced by duplicating and rotating images. The Faster R-CNN model is used for condyle detection and a Visual Geometry Group-16 (VGG16) model for condyle discrimination. To prevent overfitting, the images are augmented by rotations of ±10° and shifts of 10%. The average precision of condyle detection using the Faster R-CNN at an intersection over union (IoU) > 0.5 is 99.4% (right side) and 100% (left side). The sensitivity, specificity, and accuracy of the CNN-based TMJ OA classification algorithm are 0.54, 0.94, and 0.84, respectively. The findings demonstrate that panoramic images can be classified with CNNs, and artificial intelligence is expected to be applied more actively to the analysis of panoramic X-ray images in the future.
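The IoU > 0.5 criterion used to score condyle detection above is the standard box-overlap measure. A minimal sketch (not code from the paper; the box coordinates are purely illustrative):

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Overlap is zero when the boxes do not intersect.
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# A predicted condyle box counts as a true positive when IoU with the
# ground-truth box exceeds 0.5 (hypothetical coordinates).
pred = (100, 100, 200, 200)
gt = (120, 110, 210, 205)
print(iou(pred, gt) > 0.5)  # True for this pair (IoU ≈ 0.63)
```

Average precision at IoU > 0.5 then scores each detection against this threshold across the test set.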
The orthopantomogram (OPG) is important for the primary diagnosis of temporomandibular joint osteoarthritis (TMJOA) because of the cost and radiation dose associated with computed tomography (CT). The aims of this study were to develop an artificial intelligence (AI) model and compare its TMJOA diagnostic performance on OPGs with that of an oromaxillofacial radiology (OMFR) expert. An AI model was developed using Karas’ ResNet model and trained to classify images into three categories: normal, indeterminate OA, and OA. This study included 1189 OPG images confirmed by cone-beam CT (CBCT) and evaluated the results by model performance (accuracy, precision, recall, and F1 score) and diagnostic performance (accuracy, sensitivity, and specificity). Model performance was unsatisfactory when the AI was trained on all 3 categories. After the indeterminate OA images were reclassified as normal, reclassified as OA, or omitted, the AI diagnosed TMJOA in a manner similar to an expert and accorded best with CBCT when the indeterminate OA category was omitted (accuracy: 0.78, sensitivity: 0.73, and specificity: 0.82). Our deep learning model showed a sensitivity equivalent to that of an expert, with a better balance between sensitivity and specificity, which implies that AI can play an important role in the primary diagnosis of TMJOA from OPGs in most general practice clinics where OMFR experts or CT are not available.
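The diagnostic metrics reported above follow the standard confusion-matrix definitions. A minimal sketch; the counts in the usage example are hypothetical, chosen only so the ratios mirror the reported sensitivity (0.73) and specificity (0.82), since the abstract does not give the actual confusion matrix:

```python
def binary_metrics(tp, fp, tn, fn):
    """Standard diagnostic metrics from confusion-matrix counts
    (tp/fn: diseased cases; tn/fp: normal cases)."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)   # recall on the OA class
    specificity = tn / (tn + fp)   # recall on the normal class
    precision = tp / (tp + fp)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return {"accuracy": accuracy, "sensitivity": sensitivity,
            "specificity": specificity, "precision": precision, "f1": f1}

# Hypothetical counts for illustration only (not the study's data).
metrics = binary_metrics(tp=73, fp=18, tn=82, fn=27)
```

With these illustrative counts, sensitivity is 73/100 = 0.73 and specificity 82/100 = 0.82, matching the reported values by construction.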
Background Early detection of tooth-related diseases plays a key role in maintaining patients' dental health and preventing future complications. Because tooth-related diseases that are difficult to judge visually can escape a dentist's attention, many patients miss timely treatment. The 5 representative tooth-related diseases, that is, coronal caries or defect, proximal caries, cervical caries or abrasion, periapical radiolucency, and residual root, can be detected on panoramic images. In this study, a web service was constructed for the real-time detection of these diseases on panoramic images, which helps shorten treatment planning time and reduce the probability of misdiagnosis. Objective This study designed a model to assess tooth-related diseases in panoramic images in real time using artificial intelligence. This model can play an auxiliary role in dentists' diagnosis of tooth-related diseases and reduce the treatment planning time spent in telemedicine. Methods To learn the 5 tooth-related diseases, 10,000 panoramic images were used, containing 4206 coronal caries or defects, 4478 proximal caries, 6920 cervical caries or abrasions, 8290 periapical radiolucencies, and 1446 residual roots. The fast region-based convolutional network (Fast R-CNN), residual neural network (ResNet), and Inception models were used for training. Training a single model on all 5 tooth-related diseases at once did not provide accurate information on the diseases because of the indistinct features present in the panoramic pictures. Therefore, 1 detection model was applied to each tooth-related disease, and the per-disease models were integrated to increase accuracy. Results The Fast R-CNN model showed the highest accuracy, over 90%, in diagnosing the 5 tooth-related diseases. 
Thus, Fast R-CNN was selected as the final judgment model, as it facilitated the real-time diagnosis of dental diseases that are difficult to judge visually from radiographs and images, thereby assisting dentists with their treatment plans. Conclusions The Fast R-CNN model showed the highest accuracy in the real-time diagnosis of dental diseases and can therefore play an auxiliary role in shortening the treatment planning time after a dentist diagnoses a tooth-related disease. In addition, by updating the web service developed in this study with patients' captured panoramic images, we expect to further increase the accuracy of diagnosing these 5 tooth-related diseases. The dental diagnosis system in this study takes 2 minutes to diagnose the 5 diseases in 1 panoramic image; it can therefore play an effective role in setting a dental treatment schedule.
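The integration step described above (one detection model per disease, merged into a single report) can be sketched as follows. This is an illustrative outline, not the authors' code; the stub detectors and the 0.5 score threshold stand in for trained Fast R-CNN models and their tuned operating points:

```python
def diagnose(image, detectors, threshold=0.5):
    """Run one detector per disease and merge all confident findings
    into a single, score-sorted report for the panoramic image."""
    findings = []
    for disease, detect in detectors.items():
        for box, score in detect(image):
            if score >= threshold:  # keep only confident detections
                findings.append({"disease": disease, "box": box, "score": score})
    return sorted(findings, key=lambda f: f["score"], reverse=True)

# Stub detectors returning (box, score) pairs, standing in for the
# five per-disease Fast R-CNN models (hypothetical outputs).
detectors = {
    "proximal caries": lambda img: [((40, 60, 55, 80), 0.91)],
    "residual root":   lambda img: [((120, 90, 140, 130), 0.35)],  # below threshold
}
report = diagnose(None, detectors)
```

Keeping the models separate lets each one specialize on the visual features of a single disease, which is the accuracy rationale the abstract gives.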
BACKGROUND Lung cancer patients experience various symptoms during treatment. Although pulmonary rehabilitation is an effective way to improve these symptoms, limited availability in the medical environment makes it difficult to provide seamless and adequate rehabilitation for lung cancer patients. OBJECTIVE This study aimed to investigate the effects of a personalized pulmonary rehabilitation program using real-time mobile patient health data in patients with non–small cell lung cancer. METHODS We conducted a prospective clinical trial in 64 patients with non–small cell lung cancer aged between 20 and 80 years at a large tertiary hospital in Seoul, South Korea. A 12-week personalized pulmonary rehabilitation program, called efil breath, was administered to determine the effectiveness of the newly developed rehabilitation app. Participants were randomly allocated to the fixed exercise group or the fixed-interactive exercise group (which received the personalized program). We measured changes in 6-minute walk distance (6MWD) and dyspnea (modified Medical Research Council [mMRC] score) at 6 weeks, and quality of life and service satisfaction at 12 weeks. We used the paired t test to analyze the variables. RESULTS Patients used the newly developed mobile health pulmonary rehabilitation app and a real-time patient monitoring website. Across all participants, 6MWD changed significantly at 12 weeks, from a mean of 433.43 m (SD 65.60) to 471.25 m (SD 75.69; P=.001), as did the mMRC score, from a mean of 0.94 (SD 0.66) to 0.61 (SD 0.82; P=.02). The intervention also significantly improved quality of life (EuroQol visual analog scale [EQ-VAS]) compared with baseline (mean score 76.05, SD 12.37 vs 82.09, SD 13.67; P=.002). CONCLUSIONS A personalized mobile health–based pulmonary rehabilitation app that records and monitors real-time health data of patients with non–small cell lung cancer can supplement traditional health care center–based rehabilitation programs. This technology can encourage improvements in physical activity, dyspnea, and quality of life.
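The paired t test used for the before/after comparisons above computes its statistic from per-patient differences. A self-contained sketch (the walk-distance values are hypothetical, not the study's data; in practice a statistics library such as SciPy would also return the P value):

```python
import math

def paired_t(before, after):
    """Paired t statistic: mean of per-subject differences divided by
    the standard error of those differences."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean = sum(diffs) / n
    # Sample variance of the differences (n - 1 denominator).
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n)

# Hypothetical 6MWD values (meters) for four patients, baseline vs. 12 weeks.
baseline = [420.0, 455.0, 410.0, 448.0]
week12 = [462.0, 490.0, 445.0, 470.0]
t = paired_t(baseline, week12)  # compare against the t distribution with n-1 df
```

Pairing matters here because each patient serves as their own control, which removes between-patient variability from the comparison.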