Magnetic resonance imaging (MRI) generates a radiofrequency field (B1) to excite the nuclear spins of the object being imaged. Deviations from the nominal B1 field produce artifactual intensity nonuniformity (INU) across the image, which is problematic, especially for automated analyses that assume a tissue is represented by voxels of similar intensity throughout the image (Belaroussi et al. 2006). These artifacts are particularly exacerbated by receiver coil failures, events that are difficult to capture because they tend to be short-lived and sporadic. In brain blood-oxygen-level-dependent (BOLD) functional MRI (fMRI), B1 field dynamics are usually assessed by watching a video of the scan to spot signal intensity changes, but this approach is time-consuming and error-prone, as the human observer must stay focused throughout the whole video. Here, we showcase a visualization tool to assess B1 field dynamics, along with a derived summary metric to efficiently detect low spatial-frequency artifacts such as transient INU.
Despite substantial efforts toward improving both the tools for visual quality assessment and its automation, the quality control (QC) of imaging data remains an onerous yet critical step of analysis workflows, especially within large-scale studies. Indeed, the reliability and reproducibility of results can be improved by implementing QC checkpoints throughout the workflow (Niso et al. 2022; Provins et al. 2023). Here, we introduce Q’kay, a web service to deploy rigorous QC protocols on large datasets, leveraging the individual reports generated by tools like MRIQC (Esteban et al. 2017) and fMRIPrep (Esteban et al. 2019).
MRIQC (Esteban et al. 2017) is a tool that helps researchers perform quality control (QC) on their structural and functional MRI data. Not only does MRIQC generate visual reports for reliable manual assessment, but it also automatically extracts a set of image quality metrics (IQMs). However, these IQMs are hard to interpret, and many related questions remain open, such as which IQMs are most informative. In this project, which emerged as a BrainHack Geneva 2022 initiative, we show that head motion during the acquisition of whole-brain T1-weighted (T1w) MRI of healthy volunteers can be predicted from the IQMs using supervised machine learning. To do so, we employ the open MR-ART (movement-related artifacts; Nárai et al. 2022) dataset, which includes T1w images acquired under three different motion conditions. We show that signal-to-noise ratio (SNR) derived metrics are the most important features for predicting the presence of motion.
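The pipeline described above, predicting the motion condition from IQMs and ranking features by importance, can be sketched as follows. This is a minimal illustration with synthetic data, not the study's actual code: the feature names (`snr_total`, `snr_gm`, `cjv`), the choice of a random forest classifier, and all numeric values are assumptions for demonstration; the real analysis uses IQMs computed by MRIQC on the MR-ART dataset.

```python
# Hypothetical sketch of motion-condition prediction from IQM-like features.
# Synthetic data stands in for real MRIQC IQMs on MR-ART (an assumption).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 300  # number of synthetic "scans"

# Three motion conditions, as in MR-ART: still, low motion, high motion
y = rng.integers(0, 3, size=n)

# SNR-like features degrade with motion; one nuisance feature does not
snr_total = 20.0 - 4.0 * y + rng.normal(0, 1.5, size=n)
snr_gm = 15.0 - 3.0 * y + rng.normal(0, 1.5, size=n)
cjv = 0.4 + 0.1 * y + rng.normal(0, 0.05, size=n)  # coefficient of joint variation
nuisance = rng.normal(0, 1.0, size=n)              # unrelated to motion

X = np.column_stack([snr_total, snr_gm, cjv, nuisance])
feature_names = ["snr_total", "snr_gm", "cjv", "nuisance"]

# Supervised classifier with cross-validated accuracy
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)

# Feature importances indicate which IQMs drive the prediction
clf.fit(X, y)
importances = dict(zip(feature_names, clf.feature_importances_))
print(f"mean CV accuracy: {scores.mean():.2f}")
print(importances)
```

With data simulated this way, the SNR-derived features dominate the importance ranking while the nuisance feature contributes little, mirroring the qualitative finding reported in the abstract.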