Objective Esophagogastroduodenoscopy (EGD) is the pivotal procedure in the diagnosis of upper gastrointestinal lesions. However, there are significant variations in EGD performance among endoscopists, impairing the detection rate of gastric cancers and precursor lesions. The aim of this study was to construct a real-time quality-improvement system, WISENSE, to monitor blind spots, time the procedure, and automatically generate photodocumentation during EGD, and thus raise the quality of everyday endoscopy.
Design The WISENSE system was developed using deep convolutional neural networks and deep reinforcement learning. Patients referred for health examination, symptoms, or surveillance were recruited from Renmin Hospital of Wuhan University. Enrolled patients were randomly assigned to undergo EGD with or without the assistance of WISENSE. The primary end point was whether the rate of blind spots differed between the WISENSE-assisted group and the control group.
Results WISENSE monitored blind spots with an accuracy of 90.40% in real EGD videos. A total of 324 patients were recruited and randomised. 153 and 150 patients were analysed in the WISENSE and control groups, respectively. The blind spot rate was lower in the WISENSE group than in the control group (5.86% vs 22.46%, p<0.001), with a mean difference of −15.39% (95% CI −19.23 to −11.54). There were no significant adverse events.
Conclusions WISENSE significantly reduced the blind spot rate of the EGD procedure and could be used to improve the quality of everyday endoscopy.
Trial registration number ChiCTR1800014809; Results.
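The mean difference and 95% CI reported above can be reproduced from per-patient blind-spot rates with a standard two-sample calculation. Below is a minimal sketch, assuming a normal-approximation (Welch-style) interval and entirely made-up data rather than the trial's patient-level records.

```python
# Illustrative sketch only: an approximate 95% CI for the difference of two
# group means, as reported for the blind-spot rate (WISENSE vs. control).
# The numbers below are placeholders, not the trial's data.
import math

def mean_diff_ci(a, b, z=1.96):
    """Approximate 95% CI for mean(a) - mean(b) using a normal quantile."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    se = math.sqrt(va / len(a) + vb / len(b))           # Welch standard error
    diff = ma - mb
    return diff, (diff - z * se, diff + z * se)

# Example with made-up per-patient blind-spot percentages.
wisense = [5.0, 8.0, 3.0, 7.0, 6.0]
control = [20.0, 25.0, 18.0, 24.0, 22.0]
print(mean_diff_ci(wisense, control))
```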
Background Gastric cancer is the third most lethal malignancy worldwide. A novel deep convolutional neural network (DCNN) for visual tasks has recently been developed. The aim of this study was to build a system using the DCNN to detect early gastric cancer (EGC) without blind spots during esophagogastroduodenoscopy (EGD).
Methods 3170 gastric cancer and 5981 benign images were collected to train the DCNN to detect EGC. A total of 24549 images from different parts of the stomach were collected to train the DCNN to monitor blind spots. Class activation maps were developed to automatically highlight suspicious cancerous regions. A grid model of the stomach was used to indicate the existence of blind spots in unprocessed EGD videos.
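In their simplest form, class activation maps are computed by weighting the final convolutional feature maps with the classifier weights of the predicted class. The following PyTorch sketch is purely illustrative; the toy network, layer sizes, and image dimensions are assumptions and do not reflect the architecture used in the study.

```python
# Illustrative sketch only: a minimal class activation map (CAM) computation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyClassifier(nn.Module):
    """Toy CNN: conv features -> global average pooling -> linear classifier."""
    def __init__(self, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        fmap = self.features(x)                # (B, 32, H, W)
        pooled = fmap.mean(dim=(2, 3))         # global average pooling
        return self.classifier(pooled), fmap

def class_activation_map(model, image, target_class):
    """Weight the feature maps by the classifier weights of the target class."""
    _, fmap = model(image)
    weights = model.classifier.weight[target_class]        # (32,)
    cam = torch.einsum("c,bchw->bhw", weights, fmap)        # weighted channel sum
    cam = F.relu(cam)
    return cam / (cam.max() + 1e-8)                          # normalise to [0, 1]

model = TinyClassifier()
frame = torch.rand(1, 3, 224, 224)   # placeholder endoscopic frame
heatmap = class_activation_map(model, frame, target_class=1)
print(heatmap.shape)                 # torch.Size([1, 224, 224])
```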
Results The DCNN identified EGC from non-malignancy with an accuracy of 92.5%, a sensitivity of 94.0%, a specificity of 91.0%, a positive predictive value of 91.3%, and a negative predictive value of 93.8%, outperforming all levels of endoscopists. In the task of classifying gastric locations into 10 or 26 parts, the DCNN achieved an accuracy of 90% or 65.9%, respectively, on a par with the performance of experts. In real-time unprocessed EGD videos, the DCNN automatically detected EGC and monitored blind spots.
Conclusions We developed a DCNN-based system that detects EGC accurately, recognizes gastric locations better than endoscopists, and proactively tracks suspicious cancerous lesions and monitors blind spots during EGD.
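The diagnostic metrics reported in the Results (accuracy, sensitivity, specificity, PPV, NPV) all follow from a 2×2 confusion matrix. A minimal sketch of that computation, using placeholder counts rather than the study's image counts:

```python
# Illustrative sketch only: standard diagnostic metrics from a 2x2 confusion
# matrix. The counts below are placeholders, not the study's test-set counts.
def diagnostic_metrics(tp, fp, tn, fn):
    """Compute accuracy, sensitivity, specificity, PPV and NPV from counts."""
    return {
        "accuracy":    (tp + tn) / (tp + tn + fp + fn),
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv":         tp / (tp + fp),   # positive predictive value
        "npv":         tn / (tn + fn),   # negative predictive value
    }

# Example with made-up counts for a malignant-vs-benign test set.
print(diagnostic_metrics(tp=94, fp=9, tn=91, fn=6))
```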
Background and study aims: Qualified esophagogastroduodenoscopy (EGD) is a prerequisite for detecting upper gastrointestinal lesions, especially early gastric cancer (EGC). Our previous report showed that an artificial intelligence system could monitor blind spots during EGD. Here, we updated the system (now named ENDOANGEL), verified its effectiveness in improving endoscopy quality, and pre-tested its performance in detecting EGC in a multi-center randomized controlled trial.
Patients and methods: ENDOANGEL was developed using deep convolutional neural networks and deep reinforcement learning. Patients undergoing EGD in 5 hospitals were randomly assigned to the ENDOANGEL-assisted (EA) group or the normal control (NC) group. The primary outcome was the number of blind spots. The secondary outcome was the performance of ENDOANGEL in predicting EGC in a clinical setting.
Results: 1,050 patients were recruited and randomized. 498 and 504 patients in the EA and NC groups, respectively, were analyzed. Compared with the NC group, the EA group had fewer blind spots (5.382±4.315 vs. 9.821±4.978, p<0.001) and a longer inspection time (5.400±3.821 min vs. 4.379±3.907 min, p<0.001). Among the 498 patients in the EA group, 196 gastric lesions with pathological results were identified. ENDOANGEL correctly predicted all 3 EGCs (1 mucosal carcinoma and 2 high-grade neoplasias) and 2 advanced gastric cancers, with a per-lesion accuracy of 84.69%, a sensitivity of 100%, and a specificity of 84.29% for detecting gastric cancer.
Conclusions: The results of this multi-center study confirmed that ENDOANGEL is an effective and robust system for improving the quality of EGD, with the potential to detect EGC in real time.
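A "blind spot" in these studies is a stomach site that was never adequately observed during the procedure. The sketch below illustrates one simple way to count such sites from frame-level site predictions; the 26-site list, frame threshold, and per-frame classifier are hypothetical placeholders, not the ENDOANGEL implementation.

```python
# Illustrative sketch only: counting "blind spots" as stomach sites observed
# in too few video frames. Site labels and threshold are hypothetical.
from typing import Iterable, Set

GASTRIC_SITES = [f"site_{i:02d}" for i in range(26)]   # placeholder site labels

def blind_spots(observed_site_per_frame: Iterable[str],
                min_frames: int = 5) -> Set[str]:
    """Return sites seen in fewer than `min_frames` frames of the video."""
    counts = {site: 0 for site in GASTRIC_SITES}
    for site in observed_site_per_frame:
        if site in counts:
            counts[site] += 1
    return {site for site, n in counts.items() if n < min_frames}

# Example: a video whose frame-level predictions only ever hit three sites.
frames = ["site_00"] * 20 + ["site_01"] * 8 + ["site_05"] * 3
print(len(blind_spots(frames)))   # 24 sites under the threshold, i.e. blind spots
```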