Dense functional magnetic resonance imaging datasets open new avenues to create auto-regressive models of brain activity. Individual idiosyncrasies are obscured by group models, but can be captured by purely individual models given sufficient amounts of training data. In this study, we compared several deep and shallow individual models on the auto-regression of BOLD time series recorded during a natural video watching task. The best performing models were then analyzed in terms of their data requirements and scaling, subject specificity, and the space-time structure of their predicted dynamics.

We found the Chebnets, a type of graph convolutional neural network, to be best suited for BOLD auto-regression, closely followed by linear models. Chebnets demonstrated an increase in performance with increasing amounts of data, with no complete saturation at 9 h of training data. Significant subject specificity was found at short prediction time lags. The Chebnets were found to capture lower frequencies at longer prediction time lags, and the spatial correlations in predicted dynamics were found to match traditional functional connectivity networks.

Overall, these results demonstrate that large individual fMRI datasets can be used to efficiently train purely individual auto-regressive models of brain activity, and that massive amounts of individual data are required to do so. The excellent performance of the Chebnets likely reflects their ability to combine spatial and temporal interactions on large time scales at a low complexity cost. Individual auto-regressive models have the potential to improve our understanding of the functional interactions in the human brain. This study is based on a massive, publicly-available dataset, which can serve for future benchmarks of individual auto-regressive modeling.
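The Chebnet architecture described above rests on Chebyshev-polynomial graph filters. As a rough illustration of that core operation, the sketch below applies a K-order Chebyshev graph convolution to one-step auto-regression of a toy signal; the graph, signal values, and filter weights are illustrative placeholders, not the study's brain parcellation or trained model.

```python
import numpy as np

def cheb_conv(x, L, theta):
    """Apply a K-order Chebyshev graph filter: sum_k theta_k * T_k(L_tilde) @ x."""
    n = L.shape[0]
    # Rescale the Laplacian so its spectrum lies in [-1, 1]
    # (lambda_max of the normalized Laplacian is bounded by 2).
    L_tilde = L - np.eye(n)
    T_prev, T_curr = x, L_tilde @ x          # T_0(L)x and T_1(L)x
    out = theta[0] * T_prev + theta[1] * T_curr
    for k in range(2, len(theta)):
        # Chebyshev recurrence: T_k = 2 * L_tilde @ T_{k-1} - T_{k-2}
        T_next = 2 * (L_tilde @ T_curr) - T_prev
        out += theta[k] * T_next
        T_prev, T_curr = T_curr, T_next
    return out

# Toy 4-node ring graph and its symmetric normalized Laplacian.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)
L = np.eye(4) - A / np.sqrt(np.outer(d, d))

x_t = np.array([0.1, -0.2, 0.4, 0.0])   # signal at time t (toy values)
theta = np.array([0.5, 0.3, 0.1])       # illustrative filter weights, K = 3
x_next = cheb_conv(x_t, L, theta)       # one-step auto-regressive prediction
print(x_next.shape)                     # (4,)
```

In a trained model the weights `theta` (one set per input/output channel pair) would be fit by gradient descent to minimize the prediction error between `x_next` and the observed signal at the next time point; the low parameter count per filter order is what keeps the complexity cost small.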
Playing video games in a neuroimaging environment is both scientifically promising and technically challenging. Primary among these challenges is the need to use scanner-compatible devices to register player inputs, which limits the type of games that can be comfortably played in a scanner and often reduces the ecological validity of video game tasks. In this paper, we introduce an MRI- and MEG-compatible video game controller made exclusively of 3D-printed and commercially available parts, and we release the design files and documentation with the goal of making its production accessible to any research team with minimal engineering resources. In line with the open science philosophy, we made this work available under an Open Source Hardware license that aims to promote accessibility and reproducibility. Additionally, we validated the responsiveness and scanner compatibility of our controller by comparing it to a reference, non-MRI-compatible controller, and by assessing the quality of the data recorded with and without the controller. The analysis of response latencies showed reliable button press accuracy. A higher latency was detected on button releases, for both long and short button presses, although this effect was small enough not to affect gameplay in most situations. Analysis of subject motion during fMRI recordings of various tasks showed that the use of our controller did not increase the amount of motion produced. We hope that this tool will stimulate further neuroimaging studies of video game tasks by improving both their accessibility and their validity.