In everyday life the stream of affect results from the interaction between past experiences, expectations, and the unfolding of events. How the brain represents the relationship between time and affect has hardly been explored, as doing so requires modeling the complexity of everyday life in the laboratory setting. Movies condense into hours a multitude of emotional responses, synchronized across subjects and characterized by temporal dynamics akin to those of real-world experiences. Here, we use time-varying intersubject brain synchronization and real-time behavioral reports to test whether connectivity dynamics track changes in affect during movie watching. Results show that the polarity and intensity of experiences relate to connectivity of the default mode and control networks and converge in the right temporo-parietal cortex. We validate these results in two experiments comprising four independent samples, two movies, and alternative analysis workflows. Lastly, we reveal chronotopic connectivity maps within the temporo-parietal and prefrontal cortex, where adjacent areas preferentially encode affect at specific timescales.
The processing of multisensory information is based upon the capacity of brain regions, such as the superior temporal cortex, to combine information across modalities. However, it is still unclear whether the representation of coherent auditory and visual events requires any prior audiovisual experience to develop and function. Here we measured brain synchronization during the presentation of an audiovisual, audio-only or video-only version of the same narrative in distinct groups of sensory-deprived (congenitally blind and deaf) and typically developed individuals. Intersubject correlation analysis revealed that the superior temporal cortex was synchronized across auditory and visual conditions, even in sensory-deprived individuals who lack any audiovisual experience. This synchronization was primarily mediated by low-level perceptual features, and relied on a similar modality-independent topographical organization of slow temporal dynamics. The human superior temporal cortex is naturally endowed with a functional scaffolding to yield a common representation across multisensory events.
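The intersubject correlation analysis named above is commonly computed in a leave-one-out fashion: each subject's regional time series is correlated with the average time series of all remaining subjects. As a rough illustration only (the function name, toy data, and noise level are our own assumptions, not the authors' pipeline):

```python
import numpy as np

def leave_one_out_isc(data):
    """Leave-one-out intersubject correlation (ISC).

    data: array of shape (n_subjects, n_timepoints) holding one
    region's response time series per subject.
    Returns one value per subject: the Pearson correlation between
    that subject's series and the mean series of all other subjects.
    """
    n_subjects = data.shape[0]
    isc = np.empty(n_subjects)
    for s in range(n_subjects):
        # average the time series of every subject except s
        others = np.delete(data, s, axis=0).mean(axis=0)
        isc[s] = np.corrcoef(data[s], others)[0, 1]
    return isc

# toy example: a shared stimulus-driven signal plus subject-specific noise
rng = np.random.default_rng(0)
shared = rng.standard_normal(200)                       # common "stimulus" signal
subjects = shared + 0.3 * rng.standard_normal((10, 200))  # 10 noisy subjects
print(leave_one_out_isc(subjects).mean())               # high mean ISC
```

Regions driven by the stimulus in the same way across viewers, such as the superior temporal cortex in these experiments, yield high ISC; regions with idiosyncratic activity do not.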
The stream of affect results from a constant interaction between past experiences, motivations, expectations, and the unfolding of events. How the brain represents the relationship between time and affect has hardly been explored, as it requires modeling the complexity of everyday life in the laboratory. Movies condense into hours a multitude of emotional responses, synchronized across subjects and characterized by temporal dynamics akin to those of real-world experiences. Here, using naturalistic stimulation, time-varying intersubject brain connectivity, and behavioral reports, we demonstrate that the connectivity strength of large-scale brain networks tracks changes in affect. The default mode network represents the pleasantness of the experience, whereas attention and control networks encode its intensity. Interestingly, these orthogonal descriptions of affect converge in the right temporo-parietal and fronto-polar cortex. Within these regions, the stream of affect is represented at multiple timescales by chronotopic maps, where connectivity of adjacent areas preferentially maps experiences in 3- to 11-minute segments.

--- Figure 3 about here ---
The processing of multisensory information is based upon the capacity of brain regions, such as the superior temporal cortex, to combine information across modalities. However, it is still unclear whether the representation of coherent auditory and visual events requires any prior audiovisual experience to develop and function. In three fMRI experiments, intersubject correlation analysis measured brain synchronization during the presentation of an audiovisual, audio-only, or video-only version of the same narrative in distinct groups of sensory-deprived (congenitally blind and deaf) and typically developed individuals. The superior temporal cortex synchronized across auditory and visual conditions, even in sensory-deprived individuals who lack any audiovisual experience. This synchronization was primarily mediated by low-level perceptual features and relied on a similar modality-independent topographical organization of temporal dynamics. The human superior temporal cortex is naturally endowed with a functional scaffolding to yield a common representation across multisensory events.