Owing to the high cost of modern magnetic resonance imaging (MRI) systems, their use in clinical care and neurodevelopmental research is limited to hospitals and universities in high‐income countries. Ultra‐low‐field systems with significantly lower scanning costs present a promising avenue towards global MRI accessibility; however, their reduced SNR compared to 1.5 or 3 T systems limits their applicability for research and clinical use. In this paper, we describe a deep learning‐based super‐resolution approach to generate high‐resolution isotropic T2‐weighted scans from low‐resolution paediatric input scans. We train a ‘multi‐orientation U‐Net’, which uses multiple low‐resolution anisotropic images acquired in orthogonal orientations to construct a super‐resolved output. Our approach exhibits improved quality of outputs compared to current state‐of‐the‐art methods for super‐resolution of ultra‐low‐field scans in paediatric populations. Crucially for paediatric development, our approach improves reconstruction of deep brain structures, with the greatest improvement in volume estimates of the caudate, where our model improves upon the state‐of‐the‐art in: linear correlation (r = 0.94 vs. 0.84 using existing methods), exact agreement (Lin's concordance correlation = 0.94 vs. 0.80) and mean error (0.05 cm³ vs. 0.36 cm³). Our research serves as proof‐of‐principle of the viability of training deep learning‐based super‐resolution models for use in neurodevelopmental research and presents the first model trained exclusively on paired ultra‐low‐field and high‐field data from infants.
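The abstract above reports three agreement measures for structure volumes: linear (Pearson) correlation, Lin's concordance correlation, and mean error. As an illustration of how these differ, here is a minimal sketch; the `lins_ccc` helper and the volume values are hypothetical examples, not data from the paper.

```python
import numpy as np

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient: unlike Pearson's r,
    which only measures linear association, CCC penalises any shift or
    scaling between the two measurements, so it captures *exact*
    agreement (here, volumes from super-resolved vs. high-field scans)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()            # population variances
    cov = ((x - mx) * (y - my)).mean()   # population covariance
    return 2.0 * cov / (vx + vy + (mx - my) ** 2)

# Hypothetical caudate volumes (cm^3): model output vs. high-field target
pred   = np.array([2.91, 3.10, 2.75, 3.30, 2.98])
target = np.array([2.95, 3.05, 2.80, 3.25, 3.02])

r    = np.corrcoef(pred, target)[0, 1]   # linear correlation
ccc  = lins_ccc(pred, target)            # exact agreement
bias = np.mean(pred - target)            # mean (signed) error
```

Because CCC multiplies Pearson's r by a bias-correction factor no greater than one, CCC can only match r when the two measurement sets agree in both mean and variance, which is why the paper reports both.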
Brain magnetic resonance imaging (MRI) is essential for diagnosis and neurodevelopmental research, but the high cost and infrastructure demands of high-field MRI scanners restrict their use to high-income settings. To address this, more affordable and energy-efficient ultra-low-field MRI scanners have been developed. However, the reduced resolution and signal-to-noise ratio of the resulting scans limit their clinical utility, motivating the development of super-resolution techniques. The current state-of-the-art super-resolution methods require either three anisotropic ultra-low-field scans acquired at different orientations (axial, coronal, sagittal) to reconstruct a higher-resolution image using multi-resolution registration (MRR), or the training of deep learning super-resolution models using paired ultra-low- and high-field scans. Since acquiring three high-quality ultra-low-field scans is not always feasible, and paired high-field data may not be available for the target population, this study explores the efficacy of using a deep learning model, the 3D UNet, to generate higher-resolution brain scans from just one ultra-low-field scan. The model was trained to receive a single ultra-low-field brain scan of 6-month-old infants and produce a scan of MRR quality. Results showed a significant improvement in the quality of output scans compared to input scans, including increased image quality metrics, stronger correlations in tissue volume estimates across participants, and greater Dice overlap of the underlying tissue segmentations with those of target scans. The study demonstrates that the 3D UNet effectively enhances the resolution of ultra-low-field infant MRI scans. Generating higher-resolution brain scans from single ultra-low-field scans, without needing paired high-field data, reduces scanning time and supports wider MRI use in low- and middle-income countries. Additionally, this approach allows for easier model training on a site- and population-specific basis, enhancing adaptability in diverse settings.
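The abstract above evaluates segmentation agreement with the Dice overlap. As a brief illustration of that metric, here is a minimal sketch with toy one-dimensional masks standing in for real tissue segmentations; the mask values are hypothetical.

```python
import numpy as np

def dice(a, b):
    """Dice overlap between two binary segmentation masks:
    2|A∩B| / (|A| + |B|). Ranges from 0 (no overlap) to 1
    (perfect agreement); empty-vs-empty is defined as 1 here."""
    a, b = np.asarray(a).astype(bool), np.asarray(b).astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Toy masks standing in for output vs. target tissue segmentations
seg_output = np.array([1, 1, 1, 0, 0])
seg_target = np.array([1, 1, 0, 0, 0])
score = dice(seg_output, seg_target)  # 2*2 / (3+2) = 0.8
```

In practice the masks would be 3D voxel arrays (one per tissue class), but the formula is identical.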