Coherent anti-Stokes Raman scattering (CARS) is an emerging tool for the label-free characterization of living cells. Here, unsupervised multivariate analysis of CARS datasets was used to visualize subcellular compartments. In addition, a supervised classifier based on the "random forest" ensemble learning method was trained on CARS spectra, using immunofluorescence images as a reference. The supervised classifier was then used, to our knowledge for the first time, to automatically identify lipid droplets, the nucleus, nucleoli, and the endoplasmic reticulum in datasets that were not used for training. These four subcellular components were monitored simultaneously and label-free, rather than with several fluorescent labels. These results open new avenues for the label-free, time-resolved investigation of subcellular components in different cells, especially cancer cells.
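The workflow described above amounts to per-pixel spectral classification. The following is a minimal sketch of that idea, not the authors' code: a random forest is trained on CARS pixel spectra whose class labels come from co-registered immunofluorescence masks, and is then applied to an unseen dataset. Array shapes, file names, and class codes are illustrative assumptions.

```python
# Sketch: random forest on per-pixel CARS spectra with fluorescence-derived labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# cars_cube: hyperspectral CARS image, shape (rows, cols, n_wavenumbers)
# label_map: per-pixel class from co-registered fluorescence masks, shape (rows, cols)
#            0 = unlabeled, 1 = lipid droplets, 2 = nucleus, 3 = nucleoli, 4 = ER
cars_cube = np.load("cars_cube.npy")        # hypothetical file names
label_map = np.load("label_map.npy")

X = cars_cube.reshape(-1, cars_cube.shape[-1])   # one spectrum per pixel
y = label_map.ravel()
annotated = y > 0                                # train only on annotated pixels
X_train, X_test, y_train, y_test = train_test_split(
    X[annotated], y[annotated], test_size=0.3, stratify=y[annotated], random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

# Apply to a new, untrained dataset to obtain a label-free segmentation map
new_cube = np.load("new_cars_cube.npy")
segmentation = clf.predict(new_cube.reshape(-1, new_cube.shape[-1]))
segmentation = segmentation.reshape(new_cube.shape[:2])
```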
The current gold standard for the diagnosis of bladder cancer is cystoscopy, which is invasive and painful for patients. Noninvasive urine cytology is therefore usually used in the clinic as an adjunct to cystoscopy; however, it suffers from low sensitivity. Here, a novel noninvasive, label-free approach for urine with high sensitivity is presented. In the first step, coherent anti-Stokes Raman scattering imaging of urine sediments was used for fast preselection of urothelial cells, among which high-grade urothelial cancer cells are characterized by a large nucleus-to-cytoplasm ratio. In the second step, Raman spectral imaging of the urothelial cells was performed. A supervised classifier was implemented to automatically differentiate normal and cancerous urothelial cells with 100% accuracy. In addition, the Raman spectra not only indicated the morphological changes identified by cytology with hematoxylin and eosin staining but also provided molecular resolution through specific marker bands. The respective Raman marker bands directly show a decrease in the level of glycogen and an increase in the levels of fatty acids in cancer cells compared with controls. These results pave the way for "spectral" cytology of urine using Raman microspectroscopy.
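To illustrate what a marker-band readout of this kind can look like, here is a minimal sketch, not the published pipeline: it quantifies glycogen- and fatty-acid-related band intensities from a mean cell spectrum. The band positions (around 480 cm⁻¹ for glycogen and around 1440 cm⁻¹ for the fatty-acid CH₂ deformation) are common literature assignments used here only as assumptions; the function and variable names are hypothetical.

```python
# Sketch: marker-band intensities and their ratio from a mean Raman spectrum.
import numpy as np

def band_intensity(wavenumbers, spectrum, center, half_width=10.0):
    """Mean intensity in a window around a marker band (cm^-1)."""
    window = np.abs(wavenumbers - center) <= half_width
    return float(spectrum[window].mean())

def marker_readout(wavenumbers, spectrum):
    glycogen = band_intensity(wavenumbers, spectrum, 480.0)     # assumed assignment
    fatty_acids = band_intensity(wavenumbers, spectrum, 1440.0) # assumed assignment
    return {"glycogen": glycogen,
            "fatty_acids": fatty_acids,
            "fatty_acid_to_glycogen": fatty_acids / glycogen}

# Usage with a hypothetical mean spectrum of a segmented urothelial cell:
# readout = marker_readout(wn_axis, mean_cell_spectrum)
# A lower glycogen band and a higher fatty-acid band would be consistent with
# the trend reported for cancer cells relative to controls.
```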
A major promise of Raman microscopy is the label-free, detailed recognition of cellular and subcellular structures. To this end, identifying colocalization patterns between Raman spectral images and fluorescence microscopic images is a key step for annotating subcellular components in Raman spectroscopic images. While existing approaches to resolving subcellular structures are based on fluorescence labeling, we propose a colocalization scheme combined with subsequent training of a supervised classifier that allows label-free resolution of cellular compartments. Our colocalization scheme unveils statistically significant overlapping regions by identifying correlations between the fluorescence color channels and clusters obtained from unsupervised machine learning methods such as hierarchical cluster analysis. The colocalization scheme is used as a preselection step to gather appropriate spectra, which in the second part serve as training data for a supervised random forest classifier that automatically identifies lipid droplets and the nucleus. We validate our approach by examining Raman spectral images overlaid with fluorescence labels of different cellular compartments, indicating that specific components may indeed be identified label-free in the spectral image. A Matlab implementation of our colocalization software is available at .
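The published tool is implemented in Matlab; the following Python sketch with hypothetical inputs only illustrates the described idea: cluster pixel spectra by hierarchical cluster analysis, then correlate each cluster's membership map with each fluorescence channel to find candidate colocalized regions.

```python
# Sketch: hierarchical clustering of spectra + correlation with fluorescence channels.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import pearsonr

def colocalize(spectra, fluorescence, n_clusters=6):
    """spectra: (n_pixels, n_wavenumbers); fluorescence: (n_pixels, n_channels)."""
    Z = linkage(spectra, method="ward")              # hierarchical cluster analysis
    labels = fcluster(Z, t=n_clusters, criterion="maxclust")
    results = {}
    for k in np.unique(labels):
        membership = (labels == k).astype(float)     # binary cluster map
        for c in range(fluorescence.shape[1]):
            r, p = pearsonr(membership, fluorescence[:, c])
            results[(int(k), c)] = (r, p)            # correlation and p-value
    return results

# Clusters showing a significant positive correlation with a given fluorescence
# channel would then supply the training spectra for the random forest step.
```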
Hierarchical variants of so-called deep convolutional neural networks (DCNNs) have facilitated breakthrough results for numerous pattern recognition tasks in recent years. We assess the potential of these novel whole-image classifiers for Raman-microscopy-based cytopathology. Conceptually, DCNNs facilitate a flexible combination of spectral and spatial information for classifying cell images as healthy or cancer-affected. As we demonstrate, this conceptual advantage translates into practice, where DCNNs exceed the accuracy of both conventional classifiers based on pixel spectra and classifiers based on morphological features extracted from Raman microscopic images. Remarkably, accuracies exceeding those of all previously proposed classifiers are obtained while using only a small fraction of the spectral information provided by the dataset. Overall, our results indicate a high potential for DCNNs in medical applications of not just Raman, but also infrared microscopy.
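As a minimal sketch of this whole-image approach, the small convolutional classifier below takes a cell image built from a few selected wavenumber channels and outputs a healthy-vs-cancer score. The architecture, channel count, and hyperparameters are illustrative assumptions, not the network used in the study.

```python
# Sketch: small CNN combining spatial and (reduced) spectral information.
import torch
import torch.nn as nn

class SmallRamanCNN(nn.Module):
    def __init__(self, n_channels=4, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(n_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                  # global pooling -> (64, 1, 1)
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):                             # x: (batch, channels, H, W)
        x = self.features(x).flatten(1)
        return self.classifier(x)

# Usage with a hypothetical batch of cell images made of 4 wavenumber channels:
model = SmallRamanCNN(n_channels=4)
logits = model(torch.randn(8, 4, 128, 128))           # -> (8, 2) class scores
```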
Cervical cancer is the fourth most common cancer in women worldwide, and early detection of its precancerous lesions can decrease mortality. Cytopathology, HPV testing, and histopathology are the most commonly used tools in clinical practice. However, these methods suffer from many limitations, such as subjectivity, cost, and time. Therefore, there is an unmet clinical need for new noninvasive methods for the early detection of cervical cancer. Here, a novel noninvasive, fast, and label-free approach with high accuracy is presented using liquid-based cytology Pap smears. CARS and SHG/TPF imaging was performed at a single wavenumber on Pap smears from patients with specimens negative for intraepithelial lesions or malignancy (NILM) and with low-grade (LSIL) and high-grade (HSIL) squamous intraepithelial lesions. The normal, LSIL, and HSIL cells were selected on the basis of the nucleus-to-cytoplasm ratio and cell morphology. Raman spectral imaging of single cells from the same smears was also performed to provide integral biochemical information about the cells. Deep convolutional neural networks (DCNNs) were trained independently with CARS, SHG/TPF, and Raman images, taking into account both morphotextural and spectral information. DCNNs based on CARS, SHG/TPF, or Raman images discriminated between normal and cancerous Pap smears with 100% accuracy. These results demonstrate that CARS/SHG/TPF microscopy has prospective use as a label-free imaging technique for the fast screening of large numbers of cells in cytopathological samples.
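The cell preselection step rests on the nucleus-to-cytoplasm (N/C) ratio. Below is a minimal sketch of that computation from binary segmentation masks; the masks, function names, and any downstream thresholds are hypothetical assumptions, not part of the published protocol.

```python
# Sketch: nucleus-to-cytoplasm area ratio from binary segmentation masks.
import numpy as np

def nc_ratio(cell_mask, nucleus_mask):
    """Area ratio of nucleus to cytoplasm; both inputs are boolean 2D arrays."""
    nucleus_area = int(nucleus_mask.sum())
    cytoplasm_area = int(cell_mask.sum()) - nucleus_area
    if cytoplasm_area <= 0:
        raise ValueError("nucleus mask must lie inside the cell mask")
    return nucleus_area / cytoplasm_area

# Usage with hypothetical masks segmented from a single-wavenumber CARS image:
# ratio = nc_ratio(cell_mask, nucleus_mask)
# Cells with an elevated N/C ratio would be flagged as candidate LSIL/HSIL cells
# and passed on to Raman imaging and the DCNN classifier.
```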