The potential to differentiate between diseased and healthy tissue has been demonstrated through the extraction of morphological and functional metrics from label-free, two-photon images. Acquiring such images as fast as possible without compromising their diagnostic and functional content is critical for clinical translation of two-photon imaging. Computational restoration methods have demonstrated impressive recovery of image quality and important biological information. However, limited access to large clinical datasets has hampered the advancement of denoising algorithms. Here, we demonstrate the application of denoising algorithms to depth-resolved two-photon excited fluorescence (TPEF) images, with a specific focus on the recovery of functional metabolic metrics. Datasets were generated by collecting images of reduced nicotinamide adenine dinucleotide (phosphate) (NAD(P)H) and flavoproteins from freshly excised rat cheek epithelium. Images were patched across depth, yielding 1012 patches of 256 × 256 pixels. A well-known U-Net architecture was trained, using a structural similarity index measure (SSIM) loss function, on 6628 low-signal-to-noise-ratio (SNR) patches from a previously collected large dataset and then retrained on a smaller dataset of 620 low-SNR patches, before being validated and evaluated on 88 and 304 low-SNR patches, respectively. We demonstrate that models trained on larger datasets of human cervical tissue can successfully restore metabolic metrics, with an improvement in image quality, when applied to rat cheek epithelium images. These results motivate further exploration of weight transfer for denoising of small clinical two-photon microscopy datasets.
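The training described above optimizes an SSIM loss, which rewards structural agreement between the denoised single-frame image and the six-frame-average ground truth. As a minimal illustration of the quantity involved (not the authors' windowed implementation, which typically computes SSIM over local Gaussian-weighted patches), the sketch below computes a single-window SSIM between two pixel lists in pure Python; the constants follow the standard SSIM definition:

```python
# Illustrative single-window SSIM between two grayscale signals.
# A training loss is then typically 1 - SSIM, so identical images
# give zero loss. This is a sketch, not the paper's exact pipeline.

def ssim_global(x, y, data_range=255.0):
    """SSIM over two equal-length pixel sequences, one global window."""
    n = len(x)
    c1 = (0.01 * data_range) ** 2  # stabilizer for the luminance term
    c2 = (0.03 * data_range) ** 2  # stabilizer for the contrast term
    mx = sum(x) / n
    my = sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return ((2 * mx * my + c1) * (2 * cov + c2)) / (
        (mx ** 2 + my ** 2 + c1) * (vx + vy + c2)
    )

clean = [10.0, 50.0, 90.0, 130.0]
noisy = [14.0, 46.0, 95.0, 126.0]
# Identical inputs score exactly 1; noise pushes the score below 1.
print(ssim_global(clean, clean))   # 1.0
print(ssim_global(clean, noisy) < 1.0)
```

In practice the per-window scores are averaged over the image, and `1 - SSIM` serves as the loss minimized during U-Net training.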
Label-free, two-photon imaging captures morphological and functional metabolic tissue changes and enables enhanced understanding of numerous diseases. However, this modality suffers from low signal arising from limitations imposed by the maximum permissible dose of illumination and the need for rapid image acquisition to avoid motion artifacts. Recently, deep learning methods have been developed to facilitate the extraction of quantitative information from such images. Here, we employ deep neural architectures to synthesize a multiscale denoising algorithm optimized for restoring metrics of metabolic activity from low-SNR, two-photon images. Two-photon excited fluorescence (TPEF) images of reduced nicotinamide adenine dinucleotide (phosphate) (NAD(P)H) and flavoproteins (FAD) from freshly excised human cervical tissues are used. We assess the impact of the specific denoising model, loss function, data transformation, and training dataset on established metrics of image restoration, comparing denoised single-frame images with the corresponding six-frame averages, which serve as the ground truth. We further assess the restoration accuracy of six metrics of metabolic function from the denoised images relative to the ground truth images. Using a novel algorithm based on deep denoising in the wavelet transform domain, we demonstrate optimal recovery of metabolic function metrics. Our results highlight the promise of denoising algorithms for recovering diagnostically useful information from low-SNR, label-free two-photon images and their potential importance in the clinical translation of such imaging.
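Denoising in the wavelet transform domain, as mentioned above, rests on a transform → suppress-noise → inverse-transform pipeline: noise spreads across many small detail coefficients while signal concentrates in a few large ones. The sketch below illustrates only that generic pipeline with a one-level Haar transform and classical soft thresholding; the paper's method instead applies a learned deep denoiser to the wavelet subbands, and the function names here are illustrative:

```python
# Generic wavelet-domain denoising sketch: Haar transform, soft
# thresholding of detail coefficients, then inverse transform.
# The paper replaces the thresholding step with a deep network.

def haar_1d(signal):
    """One-level Haar transform: (approximation, detail) coefficients."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def inverse_haar_1d(approx, detail):
    """Exact inverse of haar_1d."""
    out = []
    for a, d in zip(approx, detail):
        out.extend([a + d, a - d])
    return out

def soft_threshold(coeffs, t):
    """Shrink coefficients toward zero; small (noise-like) ones vanish."""
    return [max(abs(c) - t, 0.0) * (1.0 if c >= 0 else -1.0) for c in coeffs]

def denoise(signal, t):
    approx, detail = haar_1d(signal)
    return inverse_haar_1d(approx, soft_threshold(detail, t))

# Small fluctuations around a smooth trend are removed; with t = 0 the
# round trip reconstructs the input exactly.
print(denoise([10.0, 10.4, 10.0, 9.6], t=0.5))  # [10.2, 10.2, 9.8, 9.8]
print(denoise([1.0, 2.0, 3.0, 4.0], t=0.0))     # [1.0, 2.0, 3.0, 4.0]
```

A multiscale version repeats the decomposition on the approximation coefficients, which is what lets such schemes separate noise from structure at several spatial frequencies at once.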