Currently, there are no fast and accurate screening methods available for head and neck cancer, the eighth most common tumor entity. For this study, we used hyperspectral imaging, an imaging technique for quantitative and objective surface analysis, combined with deep learning methods for automated tissue classification. As part of a prospective clinical observational study, hyperspectral datasets of laryngeal, hypopharyngeal and oropharyngeal mucosa were recorded in vivo in 98 patients before surgery. We established an automated data interpretation pathway that classifies the tissue as healthy or tumorous using convolutional neural networks with 2D spatial or 3D spatio-spectral convolutions combined with a state-of-the-art DenseNet architecture. Using 24 patients for testing, our 3D spatio-spectral DenseNet classification method achieves an average accuracy of 81%, a sensitivity of 83% and a specificity of 79%.
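A minimal sketch of what such a 3D spatio-spectral, DenseNet-style classifier could look like in PyTorch is given below; the patch shape, number of spectral bands, growth rate and layer counts are illustrative assumptions, not the configuration used in the study.

```python
# Sketch of a DenseNet-style 3D spatio-spectral classifier for hyperspectral
# patches of shape (batch, 1, bands, height, width). All widths and sizes are
# illustrative assumptions, not the values reported in the paper.
import torch
import torch.nn as nn


class DenseLayer3D(nn.Module):
    """BN-ReLU-Conv3d layer whose output is concatenated with its input."""

    def __init__(self, in_channels, growth_rate):
        super().__init__()
        self.block = nn.Sequential(
            nn.BatchNorm3d(in_channels),
            nn.ReLU(inplace=True),
            nn.Conv3d(in_channels, growth_rate, kernel_size=3, padding=1, bias=False),
        )

    def forward(self, x):
        return torch.cat([x, self.block(x)], dim=1)


class SpatioSpectralDenseNet(nn.Module):
    """Tiny dense block followed by global pooling and a 2-class head."""

    def __init__(self, growth_rate=16, num_layers=4, num_classes=2):
        super().__init__()
        self.stem = nn.Conv3d(1, 2 * growth_rate, kernel_size=3, padding=1, bias=False)
        channels = 2 * growth_rate
        layers = []
        for _ in range(num_layers):
            layers.append(DenseLayer3D(channels, growth_rate))
            channels += growth_rate
        self.dense_block = nn.Sequential(*layers)
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool3d(1), nn.Flatten(), nn.Linear(channels, num_classes)
        )

    def forward(self, x):
        return self.head(self.dense_block(self.stem(x)))


# Example: a batch of 4 patches with 30 spectral bands and 32x32 spatial extent.
logits = SpatioSpectralDenseNet()(torch.randn(4, 1, 30, 32, 32))
print(logits.shape)  # torch.Size([4, 2])
```

Because the convolutions are 3D, each kernel mixes neighboring spectral bands together with the spatial neighborhood, which is the defining difference from a purely 2D spatial network applied band-wise.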
Purpose: The gold standard for colorectal cancer metastases detection in the peritoneum is histological evaluation of a removed tissue sample. For feedback during interventions, real-time in vivo imaging with confocal laser microscopy has been proposed for differentiation of benign and malignant tissue by manual expert evaluation. Automatic image classification could improve the surgical workflow further by providing immediate feedback.
Methods: We analyze the feasibility of classifying tissue from confocal laser microscopy in the colon and peritoneum. For this purpose, we adopt both classical and state-of-the-art convolutional neural networks to learn directly from the images. As the available dataset is small, we investigate several transfer learning strategies, including partial freezing variants and full fine-tuning. We address the distinction of different tissue types, as well as benign and malignant tissue.
Results: We present a thorough analysis of transfer learning strategies for colorectal cancer with confocal laser microscopy. In the peritoneum, metastases are classified with an AUC of 97.1, and in the colon, the primary tumor is classified with an AUC of 73.1. In general, transfer learning substantially improves performance over training from scratch. We find that the optimal transfer learning strategy differs between models and classification tasks.
Conclusions: We demonstrate that convolutional neural networks and transfer learning can be used to identify cancer tissue with confocal laser microscopy. We show that there is no generally optimal transfer learning strategy; model- as well as task-specific engineering is required. Given the high performance for the peritoneum, automated detection of metastases in confocal laser microscopy images appears promising.
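As an illustration of one partial-freezing transfer learning variant described above, the PyTorch sketch below freezes most of an ImageNet-pretrained DenseNet-121 and fine-tunes only its last dense block and a new two-class head; the choice of backbone and of which blocks to unfreeze is an assumption for illustration, not necessarily a configuration evaluated in the paper.

```python
# Hedged sketch of a partial-freezing transfer learning setup for a 2-class
# (benign vs. malignant) task, using torchvision's ImageNet-pretrained
# DenseNet-121 as an example backbone.
import torch.nn as nn
from torchvision import models

model = models.densenet121(weights=models.DenseNet121_Weights.IMAGENET1K_V1)

# Freeze all pretrained weights first; training from scratch would instead
# skip loading the pretrained weights entirely.
for param in model.parameters():
    param.requires_grad = False

# Unfreeze only the last dense block and the final batch norm (partial
# freezing); full fine-tuning would unfreeze every parameter instead.
for name, param in model.named_parameters():
    if name.startswith("features.denseblock4") or name.startswith("features.norm5"):
        param.requires_grad = True

# Replace the 1000-class ImageNet head with a 2-class classifier, which is
# always trained regardless of the freezing strategy.
model.classifier = nn.Linear(model.classifier.in_features, 2)
```

Sweeping the cut-off point between frozen and trainable layers, as well as comparing against full fine-tuning and training from scratch, is what constitutes the transfer learning strategy comparison described in the abstract.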
Medulloblastoma (MB) is the most common malignant brain tumor in childhood. The diagnosis is generally based on the microscopic evaluation of histopathological tissue slides. However, visual-only assessment of histopathological patterns is a tedious and time-consuming task and is also affected by observer variability. Hence, automated MB tumor classification could assist pathologists by promoting consistency and robust quantification. Recently, convolutional neural networks (CNNs) have been proposed for this task, while transfer learning has shown promising results. In this work, we propose an end-to-end MB tumor classification approach and explore transfer learning with various input sizes and matching network dimensions. We focus on differentiating between the histological subtypes classic and desmoplastic/nodular. For this purpose, we systematically evaluate recently proposed EfficientNets, which uniformly scale all dimensions of a CNN. Using a dataset with 161 cases, we demonstrate that pre-trained EfficientNets with larger input resolutions lead to significant performance improvements compared to commonly used pre-trained CNN architectures. Also, we highlight the importance of transfer learning when using such large architectures. Overall, our best performing method achieves an F1-score of 80.1%.
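The sketch below illustrates the general idea of pairing a pretrained EfficientNet with a larger, matching input resolution, using torchvision's EfficientNet-B4 and its native 380x380 resolution as an example; the exact variant, resolution and preprocessing used in the study may differ.

```python
# Illustrative sketch: pretrained EfficientNet-B4 adapted to a 2-class
# histology task (classic vs. desmoplastic/nodular), with inputs resized to
# the resolution the backbone was scaled for. Details are assumptions.
import torch.nn as nn
from torchvision import models, transforms

weights = models.EfficientNet_B4_Weights.IMAGENET1K_V1
model = models.efficientnet_b4(weights=weights)

# Swap the 1000-class ImageNet head for a 2-class classifier.
model.classifier[1] = nn.Linear(model.classifier[1].in_features, 2)

# Resize patches to EfficientNet-B4's native 380x380 input and reuse the
# ImageNet normalization statistics from pretraining.
preprocess = transforms.Compose([
    transforms.Resize((380, 380)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
```

The point of the scaled EfficientNet family is that width, depth and input resolution grow together, so selecting a larger variant implies feeding it correspondingly larger patches rather than upsampling within the same network.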