Summary

During realistic, continuous perception, humans automatically segment experiences into discrete events. Using a novel model of cortical event dynamics, we investigate how cortical structures generate event representations during narrative perception, and how these events are stored in and retrieved from memory. Our data-driven approach allows us to detect event boundaries as shifts between stable patterns of brain activity without relying on stimulus annotations, and reveals a nested hierarchy from short events in sensory regions to long events in high-order areas (including angular gyrus and posterior medial cortex), which represent abstract, multimodal situation models. High-order event boundaries are coupled to increases in hippocampal activity, which predict pattern reinstatement during later free recall. These areas also show evidence of anticipatory reinstatement as subjects listen to a familiar narrative. Based on these results, we propose that brain activity is naturally structured into nested events, which form the basis of long-term memory representations.
Note that previous analyses of this dataset have shown that the evoked activity is similar across subjects, justifying an across-subjects design. We found that essentially all brain regions that responded consistently to the movie (across subjects) showed evidence for event-like structure, and that the optimal number of events varied across the cortex (Fig. 2). Sensory regions like visual cortex showed faster transitions between stable activity patterns, while higher-level regions like the precuneus had activity patterns that often remained constant for over a minute before transitioning to a new stable pattern (see Fig. 2 insets).
This topography of event timescales is broadly consistent with that found in previous work measuring sensitivity to temporal scrambling of a movie stimulus (see Supp. Fig. 3).
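The boundary-detection idea described above — treating events as spans of stable multivoxel patterns, with boundaries wherever the pattern shifts — can be illustrated with a toy dynamic-programming segmentation. This is a minimal sketch, not the paper's actual hidden Markov model; the function name, the within-event correlation score, and the exact-DP search are all invented here for illustration:

```python
import numpy as np

def event_segments(data, n_events):
    """Split a (time x voxel) activity matrix into contiguous events
    whose summed within-event pattern correlation is maximal.
    Exact dynamic-programming search (toy stand-in for an HMM)."""
    T, V = data.shape
    # z-score each timepoint's spatial pattern so dot products are correlations
    z = data - data.mean(axis=1, keepdims=True)
    z /= z.std(axis=1, keepdims=True)
    corr = z @ z.T / V  # time x time matrix of pattern correlations

    def span_score(i, j):
        """Mean off-diagonal correlation within the span [i, j)."""
        n = j - i
        if n < 2:
            return 0.0
        block = corr[i:j, i:j]
        return (block.sum() - n) / (n * (n - 1))

    # dp[k][t]: best total score for the first t timepoints split into k events
    NEG = -np.inf
    dp = np.full((n_events + 1, T + 1), NEG)
    back = np.zeros((n_events + 1, T + 1), dtype=int)
    dp[0][0] = 0.0
    for k in range(1, n_events + 1):
        for t in range(k, T + 1):
            for s in range(k - 1, t):  # last event covers [s, t)
                if dp[k - 1][s] == NEG:
                    continue
                val = dp[k - 1][s] + span_score(s, t)
                if val > dp[k][t]:
                    dp[k][t] = val
                    back[k][t] = s
    # walk back through the table to recover the boundary timepoints
    bounds, t = [], T
    for k in range(n_events, 0, -1):
        bounds.append(t)
        t = back[k][t]
    return sorted(bounds)[:-1]  # interior boundaries only
```

On synthetic data built from a few repeated random patterns, the recovered boundaries fall at the true pattern transitions; sweeping `n_events` and comparing fit quality is one crude way to see why the "optimal number of events" can differ between fast-switching and slow-switching regions.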
Humans are able to mentally construct an episode when listening to another person's recollection, even though they themselves did not experience the events. However, it is unknown how strongly the neural patterns elicited by mental construction resemble those found in the brain of the individual who experienced the original events. Using fMRI and a verbal communication task, we traced how neural patterns associated with viewing specific scenes in a movie are encoded, recalled, and then transferred to a group of naïve listeners. By comparing neural patterns across the 3 conditions, we report, for the first time, that event-specific neural patterns observed in the default mode network are shared across the encoding, recall, and construction of the same real-life episode. This study uncovers the intimate correspondences between memory encoding and event construction, and highlights the essential role our common language plays in the process of transmitting one's memories to other brains.
The “Narratives” collection aggregates a variety of functional MRI datasets collected while human subjects listened to naturalistic spoken stories. The current release includes 345 subjects, 891 functional scans, and 27 diverse stories of varying duration totaling ~4.6 hours of unique stimuli (~43,000 words). This data collection is well-suited for naturalistic neuroimaging analysis, and is intended to serve as a benchmark for models of language and narrative comprehension. We provide standardized MRI data accompanied by rich metadata, preprocessed versions of the data ready for immediate use, and the spoken story stimuli with time-stamped phoneme- and word-level transcripts. All code and data are publicly available with full provenance in keeping with current best practices in transparent and reproducible neuroimaging.