A multivariate sample entropy metric of signal complexity is applied to EEG data recorded while subjects viewed four prior-labeled emotion-inducing video clips from a publicly available, validated database. Besides emotion category labels, the video clips also came with arousal scores, and our subjects were additionally asked to provide their own emotion labels. In total, 30 subjects aged 19-70 years participated in our study. Rather than relying on predefined frequency bands, we estimate multivariate sample entropy over multiple data-driven scales using the multivariate empirical mode decomposition (MEMD) technique and show that in this way we can discriminate between five self-reported emotions (p < 0.05). These results could not be obtained by relating arousal scores to video clips, signal complexity to arousal scores, or self-reported emotions to traditional power spectral densities and their hemispheric asymmetries in the theta, alpha, beta, and gamma frequency bands. This shows that multivariate, multiscale sample entropy is a promising technique for discriminating multiple emotional states from EEG recordings.
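Since the abstract does not include an implementation, the following is a minimal Python sketch of multiscale sample entropy for a single channel, using conventional coarse-graining in place of the paper's MEMD-derived, data-driven scales; the function names and the m/r defaults are illustrative assumptions, not the authors' code.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy of a 1-D signal: -ln(A/B), where B and A count
    template matches of lengths m and m+1 within tolerance r * std(x)."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    n = len(x)

    def count_matches(length):
        # All overlapping templates of the given length.
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance to all later templates (each pair counted once).
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= tol)
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_sample_entropy(x, scales=range(1, 6), m=2, r=0.2):
    """Coarse-grain the signal at each scale (non-overlapping averages)
    and compute sample entropy per scale. Note: the paper derives its
    scales adaptively via MEMD instead of this fixed coarse-graining."""
    x = np.asarray(x, dtype=float)
    out = []
    for s in scales:
        coarse = x[:len(x) // s * s].reshape(-1, s).mean(axis=1)
        out.append(sample_entropy(coarse, m, r))
    return np.array(out)
```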
A data-adaptive, multiscale version of Rényi's quadratic entropy (RQE) is introduced for emotional state discrimination from EEG recordings. The algorithm is applied to scalp EEG recordings of 30 participants watching 4 emotionally charged video clips taken from a validated public database. Krippendorff's inter-rater statistic reveals that multiscale RQE of the mid-frontal scalp electrodes best discriminates between five emotional states. Multiscale RQE is also applied to joint scalp EEG and amygdala and occipital-pole intracranial recordings of an implanted patient watching a neutral and an emotionally charged video clip. Unlike for the neutral video clip, the RQEs of the mid-frontal scalp electrodes and the amygdala-implanted electrodes are observed to coincide in the time range where the crux of the emotionally charged video clip is revealed. During this same time range, phase synchrony between the amygdala and mid-frontal recordings is maximal, as is our 30 participants' inter-rater agreement on the same video clip. A source reconstruction exercise using the intracranial recordings supports our assertion that the amygdala could contribute to mid-frontal scalp EEG. In contrast, no such contribution was observed for the occipital pole's intracranial recordings. Our results suggest that emotional states discriminated from mid-frontal scalp EEG are likely to be mirrored by differences in amygdala activations, particularly when recorded in response to emotionally charged scenes.
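As a rough illustration of the core quantity, this Python sketch estimates Rényi's quadratic entropy from samples using a Parzen (Gaussian-kernel) estimator of the information potential; the bandwidth heuristic is an assumption, and the paper's data-adaptive multiscale version would apply such an estimator per decomposition scale (analogous to the coarse-graining sketch above).

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=None):
    """Parzen-window estimate of Renyi's quadratic entropy:
    H2 = -log( (1/N^2) * sum_ij G(x_i - x_j; 2*sigma^2) ),
    where the double sum over Gaussian kernels is the
    'information potential' of the sample."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if sigma is None:
        # Silverman-style bandwidth as a rough default (an assumption).
        sigma = 1.06 * np.std(x) * n ** (-1 / 5)
    diffs = x[:, None] - x[None, :]
    var = 2.0 * sigma ** 2  # convolving two kernels doubles the variance
    kernel = np.exp(-diffs ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)
    return -np.log(kernel.mean())
```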
Assessing the human affective state using electroencephalography (EEG) has shown good potential but has failed to demonstrate reliable performance in real-life applications, especially when the recording setup itself might impact affective processing and when generalized models of affect are relied upon. Additionally, using subjective assessment of one's own affect as ground truth has often been disputed. To shed light on the former challenge, we explored the use of a convenient EEG system with 20 participants to capture their reactions to affective movie clips in a naturalistic setting. Employing a state-of-the-art machine learning approach, we demonstrated that the highest performance is reached when combining linear features, namely symmetry features and single-channel features, with nonlinear ones derived by a multiscale entropy approach. Nevertheless, the best performance, reflected in the highest F1-score achieved in a binary classification task, was 0.71 for valence and 0.62 for arousal. This performance was 10-20% better than that obtained using ratings provided by 13 independent raters. We argue that affective self-assessment might be underrated and that it is crucial to account for personal differences in both perception and physiological response to affective cues.
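A hedged sketch of the evaluation pipeline this abstract describes: concatenating linear (band-power/asymmetry) and nonlinear (multiscale entropy) features and scoring a binary valence classifier by F1. The feature matrices, the logistic-regression classifier, and all dimensions below are placeholders, since the abstract does not specify them.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical feature matrices: rows are trials (movie-clip epochs).
# X_linear: band power per channel plus hemispheric symmetry features;
# X_nonlinear: multiscale entropy values per channel and scale.
rng = np.random.default_rng(0)
X_linear = rng.normal(size=(200, 40))
X_nonlinear = rng.normal(size=(200, 25))
y_valence = rng.integers(0, 2, size=200)  # binary valence labels

# Concatenate linear and nonlinear feature sets, since the study reports
# that their combination performs best.
X = np.hstack([X_linear, X_nonlinear])

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
f1 = cross_val_score(clf, X, y_valence, cv=5, scoring="f1")
print(f"mean F1 (valence): {f1.mean():.2f}")
```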
Time is as pervasive as it is elusive to study, and how the brain keeps track of millisecond time is still unclear. Here we studied the mechanisms underlying duration perception by looking for a neural signature of the subjective time distortion induced by motion adaptation. We recorded electroencephalographic signals in human participants while they were asked to discriminate the duration of visual stimuli after translational motion adaptation. Our results show that distortions of subjective time can be predicted by the amplitude of the N200 event-related potential and by activity in the beta frequency band. Both effects were observed at occipital electrodes contralateral to the adapted stimulus. Finally, a multivariate decoding analysis highlights the impact of motion adaptation throughout the visual stream. Overall, our findings show the crucial involvement of local, low-level perceptual processes in generating a subjective sense of time.
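For readers unfamiliar with time-resolved multivariate decoding, the sketch below trains a separate classifier per time sample of epoched EEG; above-chance accuracy at a latency indicates condition information there. The data shapes, labels, and choice of logistic regression are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical epoched EEG: (trials, channels, time samples) with binary
# condition labels (e.g., adapted vs. non-adapted stimulus side).
rng = np.random.default_rng(1)
epochs = rng.normal(size=(120, 64, 200))
labels = rng.integers(0, 2, size=120)

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# Decode condition from all channels at each (decimated) time sample;
# cross-validated accuracy traces when the conditions become separable.
times = range(0, epochs.shape[2], 10)
scores = np.array([
    cross_val_score(clf, epochs[:, :, t], labels, cv=5).mean()
    for t in times
])
```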