This article presents a systematic overview of artificial intelligence (AI) and computer vision strategies for diagnosing coronavirus disease 2019 (COVID-19) using computed tomography (CT) medical images. We analyzed previous review works and found that none of them classified and categorized the COVID-19 literature by computer vision task, such as classification, segmentation, and detection. Most COVID-19 CT diagnosis methods rely on a combination of segmentation and classification tasks. Moreover, most review articles are broad in scope, covering X-ray as well as CT images; we therefore focused on COVID-19 diagnostic methods based on CT images. Well-known search engines and databases, including Google, Google Scholar, Kaggle, Baidu, IEEE Xplore, Web of Science, PubMed, ScienceDirect, and Scopus, were used to collect relevant studies. After in-depth analysis, we selected 114 studies and report detailed information for each. According to our analysis, AI and computer vision have substantial potential for rapid COVID-19 diagnosis, as they could significantly automate the diagnostic process. Accurate and efficient models will have real-time clinical implications, though further research is still required. Categorizing the literature by computer vision task could be helpful for future research; this review therefore provides a foundation for conducting such work.
Background: Autophagy, a highly conserved self-digestion process, is deeply involved in the development and progression of oral squamous cell carcinoma (OSCC). However, the prognostic value of autophagy-related genes (ARGs) in OSCC remains unclear. Our study set out to develop a multigene expression signature based on ARGs for individualized prognosis assessment in OSCC patients.

Methods: Using The Cancer Genome Atlas (TCGA) database, we identified prognosis-related ARGs through univariate Cox regression analysis. We then performed least absolute shrinkage and selection operator (LASSO) regression analysis to identify an optimal autophagy-related multigene signature, with subsequent validation in the testing set and the GSE41613 and GSE42743 datasets.

Results: We identified 36 prognosis-related ARGs for OSCC. A multigene signature based on 13 prognostic ARGs was then constructed; it successfully divided OSCC patients into low- and high-risk groups with significantly different overall survival in the TCGA training set (p < 0.0001). The autophagy signature remained an independent prognostic factor for OSCC in univariate and multivariate Cox regression analyses. The areas under the curve (AUC) of the receiver operating characteristic (ROC) curves for 1-, 3-, and 5-year survival were 0.758, 0.810, and 0.798, respectively. The gene signature was then validated in the TCGA testing set and the GSE41613 and GSE42743 datasets. Moreover, Gene Ontology (GO) analysis, Kyoto Encyclopedia of Genes and Genomes (KEGG) analysis, and single-sample gene set enrichment analysis (ssGSEA) revealed the underlying biological characteristics and signaling pathways associated with this signature in OSCC. Finally, we constructed a nomogram combining the gene signature with multiple clinical parameters (age, gender, TNM stage, and tobacco and alcohol history). The concordance index (C-index) and calibration plots demonstrated the favorable predictive performance of our nomogram.
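The risk-stratification step described in this abstract — a linear risk score computed from LASSO-selected gene coefficients, with patients split into high- and low-risk groups at the median score — can be sketched as follows. This is a minimal illustration under stated assumptions, not the study's code: the gene names (GENE_A, GENE_B, GENE_C) and coefficient values are placeholders, since the actual 13-gene signature and its fitted weights are reported in the paper itself.

```python
from statistics import median

# Hypothetical LASSO-Cox coefficients for a small illustrative gene panel.
# Positive weights raise the risk score; negative weights lower it.
COEFFICIENTS = {
    "GENE_A": 0.42,
    "GENE_B": -0.31,
    "GENE_C": 0.18,
}

def risk_score(expression):
    """Linear risk score: sum of coefficient * normalized gene expression."""
    return sum(coef * expression[gene] for gene, coef in COEFFICIENTS.items())

def stratify(patients):
    """Split patients into 'high' and 'low' risk groups at the median score."""
    scores = {pid: risk_score(expr) for pid, expr in patients.items()}
    cutoff = median(scores.values())
    return {pid: ("high" if s > cutoff else "low") for pid, s in scores.items()}

cohort = {
    "p1": {"GENE_A": 2.0, "GENE_B": 0.5, "GENE_C": 1.0},
    "p2": {"GENE_A": 0.1, "GENE_B": 3.0, "GENE_C": 0.2},
    "p3": {"GENE_A": 1.0, "GENE_B": 1.0, "GENE_C": 1.0},
}
groups = stratify(cohort)
```

In practice the coefficients come from a penalized Cox model (e.g. LASSO-regularized Cox regression fitted on the TCGA training set), and the two groups are then compared by Kaplan-Meier analysis and log-rank testing, as the study describes.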