We describe the creation of an affective film dataset for researchers interested in studying a broad spectrum of emotional experiences. Two hundred twenty-two 60-second video clips were selected using multimedia content analysis and screened in the lab with 407 participants, whose ratings in this first stage mapped onto 31 emotion categories. Based on the selection criteria, 69 audio-visual clips were retained and presented to a further 271 participants, who rated each clip on rating scales and assigned it to an emotion category. The clips reliably induced 19 basic and complex emotion categories. Because the dataset comprises film clips drawn from both Indian and Western content, it can be used effectively for cross-cultural emotion research. Researchers can select emotional movie clips from the dataset based on the ratings and quantitative measures, including the reliability measures reported in this work. We also demonstrate the continuity of emotional experiences using an advanced visualisation technique, complementing existing knowledge based on valence-arousal (V-A) space with information on how transitions among emotion categories take place.
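As a minimal sketch of how clips might be selected from such a ratings table, the snippet below filters on dimensional ratings and a reliability measure. The file name and column names (clip_id, valence_mean, arousal_mean, agreement, emotion_category) are hypothetical placeholders, not the dataset's actual schema.

```python
# Hypothetical example: select low-valence, high-arousal clips with good
# inter-rater agreement from a ratings table. Column names are assumptions.
import pandas as pd

ratings = pd.read_csv("clip_ratings.csv")  # hypothetical ratings file

selected = ratings[
    (ratings["valence_mean"] < 4.0)
    & (ratings["arousal_mean"] > 6.0)
    & (ratings["agreement"] > 0.6)
]
print(selected[["clip_id", "emotion_category"]])
```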
Emotion is a constructed phenomenon that emerges from the dynamic interaction of multiple neurological, physiological and behavioural components. Such dynamics cannot be captured by static, tightly controlled experiments, so the study of emotion with a naturalistic paradigm is needed. In this dataset, naturalistic multimedia stimuli are used to capture emotional dynamics with EEG, ECG, EMG and behavioural scales. The stimuli are multimedia videos collected from YouTube for 372 affective words, analysed with multimedia content analysis to filter out non-emotional material, and then validated with university students; the validated stimuli showed the least variance in subjective ratings on self-assessment scales. These stimuli were then used to acquire neural dynamics along with peripheral channels and subjective ratings of valence, arousal, dominance, liking, familiarity, relevance and emotion category. Both the raw and pre-processed data are provided, together with the pre-processing pipeline. The data can be used to study dynamic activation and connectivity in whole-brain source-localization studies, examine the mutual interaction between the central and autonomic nervous systems, characterize temporal hierarchy using multiresolution tools, and perform machine-learning-based classification and complex-network analysis. The data are accessible at \url{10.18112/openneuro.ds003751.v1.0.0}.
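As a minimal sketch of accessing one participant's EEG from a local copy of the OpenNeuro dataset (ds003751), assuming it follows the standard BIDS layout that OpenNeuro enforces, one could use mne-bids as below. The subject and task labels are placeholders; check the dataset's participants.tsv and sidecar files for the real ones.

```python
# Hypothetical loading example assuming a BIDS-formatted EEG dataset.
from mne_bids import BIDSPath, read_raw_bids

bids_root = "ds003751"           # path to a local copy of the dataset
path = BIDSPath(subject="01",    # hypothetical subject label
                task="emotion",  # hypothetical task label
                datatype="eeg",
                root=bids_root)

raw = read_raw_bids(path)        # returns an mne.io.Raw object
print(raw.info)
```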
While naturalistic stimuli, such as movies, better represent the complexity of the real world and are perhaps crucial to understanding the dynamics of emotion processing, research on emotion with naturalistic stimuli remains limited. There is a need to understand the temporal dynamics of emotion processing and their relationship to different dimensions of emotional experience, as well as the dynamics of functional connectivity that occur during or prior to such experiences. To address these questions, we recorded participants' EEG and asked them to mark the temporal location of their emotional experience while watching videos; we also obtained self-assessment ratings for the emotional multimedia stimuli. We calculated dynamic functional connectivity (DFC) patterns in all frequency bands, including information about hubs in the network. Changes in functional networks were quantified in terms of temporal variability, which was then used in regression analyses to evaluate whether temporal variability in DFC (tvDFC) could predict different dimensions of emotional experience. We observed that connectivity patterns in the upper beta band differentiated emotion categories better during, or prior to, the reported emotional experience. Temporal variability in functional connectivity dynamics was related primarily to emotional arousal, followed by dominance. Hubs in the functional networks were found across the right frontal and bilateral parietal lobes, regions reported to facilitate affect, interoception, action, and memory-related processing. Since the study was performed with naturalistic, real-life-resembling emotional videos, it contributes significantly to understanding the dynamics of emotion processing. The results support constructivist theories of emotional experience and show that changes in dynamic functional connectivity can predict aspects of our emotional experience.
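The following is a rough sketch of one common way to estimate temporal variability of dynamic functional connectivity from band-limited EEG: sliding-window correlation matrices, followed by a per-node variability score (one minus the mean similarity of that node's connectivity profile across windows). The window length, step size and variability definition are assumptions for illustration, not necessarily the pipeline used in the study.

```python
import numpy as np

def tv_dfc(data, sfreq, win_sec=2.0, step_sec=0.5):
    """Estimate per-channel temporal variability of functional connectivity.

    data: (n_channels, n_samples) band-pass filtered EEG.
    """
    n_ch, n_samp = data.shape
    win, step = int(win_sec * sfreq), int(step_sec * sfreq)

    # Sliding-window connectivity matrices (Pearson correlation).
    profiles = []
    for start in range(0, n_samp - win + 1, step):
        seg = data[:, start:start + win]
        fc = np.corrcoef(seg)
        np.fill_diagonal(fc, 0.0)
        profiles.append(fc)
    profiles = np.stack(profiles)        # (n_windows, n_ch, n_ch)

    # Temporal variability: 1 minus the mean between-window similarity
    # of each channel's connectivity profile.
    variability = np.empty(n_ch)
    for ch in range(n_ch):
        prof = profiles[:, ch, :]        # connectivity profile over time
        sim = np.corrcoef(prof)          # window-by-window similarity
        off_diag = sim[~np.eye(len(sim), dtype=bool)]
        variability[ch] = 1.0 - off_diag.mean()
    return variability
```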
Our brain continuously interacts with the body as we engage with the world. Although we are mostly unaware of internal bodily processes, such as our heartbeats, they may be influenced by, and in turn influence, our perception and emotional feelings. Despite a recent focus on understanding cardiac interoceptive activity and its interaction with brain activity during emotion processing, investigations of cardiac–brain interactions with more ecologically valid, naturalistic emotional stimuli remain very limited. We also do not understand how an essential aspect of emotion, such as context familiarity, influences affective feelings and is linked to the statistical interaction between cardiac and brain activity. To answer these questions, we designed an exploratory study in which ECG and EEG signals were recorded for emotional events while participants watched emotional movie clips; participants also rated their familiarity with each stimulus on a familiarity scale. Linear mixed-effects modelling was performed with ECG power and familiarity as predictors of EEG power, focusing on three brain regions: prefrontal (PF), frontocentral (FC) and parieto-occipital (PO). The analyses showed that the interaction between the power of cardiac activity in the mid-frequency range and the power in specific EEG bands depends on familiarity, with a stronger interaction at high familiarity. In addition, the results indicate that arousal is predicted by cardiac–brain interaction, which also depends on familiarity. These results support emotional theories that emphasize context dependency and interoception. Multimodal studies with more realistic stimuli would further enable us to understand and predict different aspects of emotional experience.
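Below is a minimal sketch of the kind of linear mixed-effects model described above, using statsmodels. The data frame and column names (eeg_power, ecg_power, familiarity, participant) are illustrative placeholders; the study's exact model terms and random-effects structure may differ.

```python
# Hypothetical trial-level table: one row per emotional event per participant.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("trial_level_power.csv")   # hypothetical input file

# EEG power predicted by ECG power, familiarity and their interaction,
# with a random intercept per participant.
model = smf.mixedlm("eeg_power ~ ecg_power * familiarity",
                    data=df,
                    groups=df["participant"])
result = model.fit()
print(result.summary())
```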
The availability of naturalistic affective stimuli is needed both for creating affective technological solutions and for making progress in affective science. Although considerable progress in the collection of affective multimedia stimuli has been made in Western countries, technology and findings based on such monocultural datasets may not generalize to other cultures. Moreover, available affective multimedia datasets carry some experimenter bias from the initial manual selection of content. In this work, we therefore address two problems: the experimenter's subjective bias, and the lack of an affective multimedia dataset validated on an Indian population. We addressed both by reducing the experimenter's bias as much as possible, adopting data science and multimedia content analysis techniques for the initial collection and subsequent selection of stimuli. Our method resulted in a dataset with a wide variety of content, stimuli from both Western and Indian cinema, and a symmetric distribution of stimuli along the valence, arousal and dominance dimensions. We conclude that, using our method, more cross-cultural affective stimulus datasets can be created, which is essential for progress in affective technology and science.