The “Narratives” collection aggregates a variety of functional MRI datasets collected while human subjects listened to naturalistic spoken stories. The current release includes 345 subjects, 891 functional scans, and 27 diverse stories of varying duration totaling ~4.6 hours of unique stimuli (~43,000 words). This data collection is well-suited for naturalistic neuroimaging analysis, and is intended to serve as a benchmark for models of language and narrative comprehension. We provide standardized MRI data accompanied by rich metadata, preprocessed versions of the data ready for immediate use, and the spoken story stimuli with time-stamped phoneme- and word-level transcripts. All code and data are publicly available with full provenance in keeping with current best practices in transparent and reproducible neuroimaging.
Retrieval of learning-related neural activity patterns is thought to drive memory stabilization. However, finding reliable, noninvasive, content-specific indicators of memory retrieval remains a central challenge. Here, we attempted to decode the content of retrieved memories in the EEG during sleep. During encoding, male and female human subjects learned to associate spatial locations of visual objects with left- or right-hand movements, and each object was accompanied by an inherently related sound. During subsequent slow-wave sleep within an afternoon nap, we presented half of the sound cues that were associated (during wake) with left- and right-hand movements before bringing subjects back for a final postnap test. We trained a classifier on sleep EEG data (focusing on lateralized EEG features that discriminated left- vs. right-sided trials during wake) to predict learning content when we cued the memories during sleep. Discrimination performance was significantly above chance and predicted subsequent memory, supporting the idea that retrieval leads to memory stabilization. Moreover, these lateralized signals increased with postcue sleep spindle power, demonstrating a strong relationship between retrieval and spindles. These results show that lateralized activity related to individual memories can be decoded from sleep EEG, providing an effective indicator of offline retrieval.
Competition between memories can cause weakening of those memories. Here we investigated memory competition during sleep in human participants by presenting auditory cues that had been linked to two distinct picture-location pairs during wake. We manipulated competition during learning by requiring participants to rehearse picture-location pairs associated with the same sound either competitively (choosing to rehearse one over the other, leading to greater competition) or separately; we hypothesized that greater competition during learning would lead to greater competition when memories were cued during sleep. With separate-pair learning, we found that cueing benefited spatial retention. With competitive-pair learning, cueing produced no overall benefit and actually impaired retention of well-learned pairs (where we expected strong competition). During sleep, post-cue beta power (16-30 Hz) indexed competition and predicted forgetting, whereas sigma power (11-16 Hz) predicted subsequent retention. Taken together, these findings show that competition between memories during learning can modulate how they are consolidated during sleep.
Functional magnetic resonance imaging (fMRI) offers a rich source of data for studying the neural basis of cognition. Here, we describe the Brain Imaging Analysis Kit (BrainIAK), a free, open-source Python package that provides computationally optimized solutions to key problems in advanced fMRI analysis. A variety of techniques are presently included in BrainIAK: intersubject correlation (ISC) and intersubject functional connectivity (ISFC), functional alignment via the shared response model (SRM), full correlation matrix analysis (FCMA), a Bayesian version of representational similarity analysis (BRSA), event segmentation using hidden Markov models, topographic factor analysis (TFA), inverted encoding models (IEMs), an fMRI data simulator that uses noise characteristics from real data (fmrisim), and some emerging methods. These techniques have been optimized to leverage the efficiencies of high-performance computing (HPC) clusters, and the same code can be seamlessly transferred from a laptop to a cluster. For each of the aforementioned techniques, we describe the data analysis problem that the technique is meant to solve and how it solves that problem; we also include an example Jupyter notebook for each technique and an annotated bibliography of papers that have used and/or described that technique. In addition to the sections describing various analysis techniques in BrainIAK, we have included sections describing the future applications of BrainIAK to real-time fMRI, tutorials that we have developed and shared online to facilitate learning the techniques in BrainIAK, computational innovations in BrainIAK, and how to contribute to BrainIAK. We hope that this manuscript helps readers to understand how BrainIAK might be useful in their research.
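To illustrate the first of these techniques, the core idea behind leave-one-out intersubject correlation can be sketched in plain NumPy. This is a simplified stand-in, not BrainIAK's optimized implementation (which lives in its `brainiak.isc` module); the function and variable names here are illustrative:

```python
import numpy as np

def leave_one_out_isc(data):
    """Leave-one-out intersubject correlation (ISC) sketch.

    For each subject, correlate their time series (per voxel) with the
    average time series of all remaining subjects.

    data: array of shape (n_timepoints, n_voxels, n_subjects)
    returns: array of shape (n_subjects, n_voxels) of Pearson r values
    """
    n_subj = data.shape[2]
    iscs = np.zeros((n_subj, data.shape[1]))
    for s in range(n_subj):
        left_out = data[:, :, s]
        others = data[:, :, np.arange(n_subj) != s].mean(axis=2)
        # z-score each voxel's time series, then average the elementwise
        # products; this yields the Pearson correlation per voxel
        z_lo = (left_out - left_out.mean(axis=0)) / left_out.std(axis=0)
        z_ot = (others - others.mean(axis=0)) / others.std(axis=0)
        iscs[s] = (z_lo * z_ot).mean(axis=0)
    return iscs
```

Voxels driven by a shared naturalistic stimulus yield high ISC values, whereas idiosyncratic or noise-driven voxels yield values near zero, which is what makes ISC a natural fit for the story-listening data described above.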