We demonstrate residual channel attention networks (RCAN) for restoring and enhancing volumetric time-lapse (4D) fluorescence microscopy data. First, we modify RCAN to handle image volumes, showing that our network enables denoising competitive with three other state-of-the-art neural networks. We use RCAN to restore noisy 4D super-resolution data, enabling image capture over tens of thousands of images (thousands of volumes) without apparent photobleaching. Second, using simulations, we show that RCAN enables class-leading resolution enhancement, superior to other networks. Third, we exploit RCAN for denoising and resolution improvement in confocal microscopy, enabling ~2.5-fold lateral resolution enhancement using stimulated emission depletion (STED) microscopy ground truth. Fourth, we develop methods to improve spatial resolution in structured illumination microscopy using expansion microscopy ground truth, achieving improvements of ~1.4-fold laterally and ~3.4-fold axially. Finally, we characterize the limits of denoising and resolution enhancement, suggesting practical benchmarks for evaluating and further enhancing network performance.

… data, which we deconvolved to yield high-SNR 'ground truth'. We then used 30 of these volumes for training and held out further volumes for testing network performance. Using the same training and test data, we compared four networks: RCAN, CARE, SRResNet20, and ESRGAN21. SRResNet and ESRGAN are both class-leading deep residual networks used in image super-resolution, with ESRGAN winning the 2018 Perceptual Image Restoration and Manipulation challenge on perceptual image super-resolution22.

For the mEmerald-Tomm20 label, RCAN, CARE, ESRGAN, and SRResNet predictions all provided clear improvements in visual appearance, structural similarity index (SSIM), and peak signal-to-noise ratio (PSNR) metrics relative to the raw input (Fig. 1b), also outperforming direct deconvolution on the noisy input data (Supplementary Fig. 1). The RCAN output provided PSNR and SSIM values competitive with the other networks (Fig. 1b), prompting us to investigate whether this performance held for other organelles. We thus conducted similar experiments on fixed U2OS cells with labeled actin, endoplasmic reticulum (ER), Golgi, lysosomes, and microtubules (Supplementary Fig. 2), acquiring 15–23 volumes of training data and training independent networks for each organelle. In almost all cases, RCAN performance met or exceeded that of the other networks (Supplementary Fig. 3, Supplementary Table 3).

An essential consideration when using any deep learning method is understanding when network performance deteriorates. Independently training an ensemble of networks and computing measures of network disagreement can provide insight into this issue9,16, yet such measures were not generally predictive of disagreement between ground truth and RCAN output (Supplementary Fig. 4). Instead, we found that estimating the per-pixel SNR in the raw input (Methods, Supplementary Fig. 4) seemed to better correlate with network ...
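The first contribution above is a volumetric variant of RCAN, whose core unit is the residual channel attention block (RCAB): two convolutions followed by squeeze-and-excitation style channel attention, wrapped in a skip connection. The sketch below is a minimal PyTorch rendition, not the authors' implementation; it shows how such a block extends to image volumes by swapping 2D for 3D convolutions. The channel count and reduction ratio are illustrative assumptions.

```python
# Minimal sketch (assumed layout, not the paper's exact configuration) of a
# 3D residual channel attention block, the core unit RCAN stacks, adapted to
# volumetric data by using 3D convolutions throughout.
import torch
import torch.nn as nn

class ChannelAttention3D(nn.Module):
    """Squeeze-and-excitation style attention over feature channels."""
    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool3d(1)  # global average over z, y, x
        self.fc = nn.Sequential(
            nn.Conv3d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv3d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),                    # per-channel weights in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.fc(self.pool(x))     # rescale each feature channel

class RCAB3D(nn.Module):
    """Residual channel attention block: conv-ReLU-conv, attention, skip."""
    def __init__(self, channels: int = 32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv3d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv3d(channels, channels, kernel_size=3, padding=1),
            ChannelAttention3D(channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.body(x)              # skip preserves low-frequency content

# Example forward pass on a feature volume shaped (batch, channels, z, y, x).
block = RCAB3D(channels=32)
features = torch.randn(1, 32, 16, 64, 64)
print(block(features).shape)                 # torch.Size([1, 32, 16, 64, 64])
```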
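For the quantitative comparisons above, restored volumes are scored against the deconvolved ground truth with PSNR and SSIM. Below is a minimal sketch of such an evaluation using scikit-image, assuming both volumes are same-shape NumPy arrays; the data-range handling is our assumption, not the paper's exact protocol.

```python
# Minimal sketch: score a restored volume against high-SNR ground truth with
# PSNR and SSIM. Array names and data-range handling are illustrative.
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def score_volume(restored: np.ndarray, ground_truth: np.ndarray):
    """Return (PSNR in dB, SSIM) for two volumes of identical shape."""
    rng = float(ground_truth.max() - ground_truth.min())
    psnr = peak_signal_noise_ratio(ground_truth, restored, data_range=rng)
    ssim = structural_similarity(ground_truth, restored, data_range=rng)
    return psnr, ssim

# Example with random stand-in volumes shaped (z, y, x).
gt = np.random.rand(16, 128, 128).astype(np.float32)
noisy = gt + 0.1 * np.random.randn(*gt.shape).astype(np.float32)
print(score_volume(noisy, gt))
```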
Sensory modulation is essential for animal sensations, behaviours and survival. Peripheral modulation of nociceptive sensations and aversive behaviours is poorly understood. Here we identify a biased cross-inhibitory neural circuit between ASH and ASI sensory neurons. This inhibition is essential to drive normal adaptive avoidance of a CuSO₄ (Cu²⁺) challenge in Caenorhabditis elegans. In the circuit, ASHs respond to Cu²⁺ robustly and suppress ASIs via electro-synaptically exciting octopaminergic RIC interneurons, which release octopamine (OA) and neuroendocrinally inhibit ASIs by acting on the SER-3 receptor. In addition, ASIs sense Cu²⁺ and permit a rapid onset of Cu²⁺-evoked responses in Cu²⁺-sensitive ADF neurons, possibly via neuropeptides, to inhibit ASHs. ADFs function as interneurons to mediate ASI inhibition of ASHs by releasing serotonin (5-HT), which binds to the SER-5 receptor on ASHs. This elaborate modulation among sensory neurons via reciprocal inhibition fine-tunes nociception and avoidance behaviour.