In this study, we investigate brain networks during positive and negative emotions for different stimulus types (audio only, video only, and audio + video) in the [Formula: see text] and [Formula: see text] bands in terms of the phase locking value (PLV), a nonlinear measure of functional connectivity. Results show notable hemispheric lateralization: phase synchronization values between channels are significant and high in the right hemisphere for all emotions. Left frontal electrodes are also found to play a controlling role in emotion in terms of functional connectivity. In addition, significant inter-hemispheric phase locking values are observed between the left and right frontal regions, specifically between the left anterior frontal and the right mid-frontal, inferior frontal, and anterior frontal regions, as well as between the left and right mid-frontal regions. ANOVA across stimulus types shows that the types are not separable for emotions with high valence. PLV values differ significantly only for negative or neutral emotions, and only between the audio only/video only and audio only/audio + video stimulus pairs. The absence of a significant difference between the video only and audio + video stimuli is interesting and might be interpreted as indicating that the video content is the most effective part of a stimulus.
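The phase locking value used above can be sketched as follows: instantaneous phases are extracted with the Hilbert transform and the PLV is the magnitude of the mean phase-difference vector. This is a minimal illustration with synthetic signals standing in for EEG channels; the frequencies and phase-noise level are assumptions, not values from the study.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """PLV = |mean(exp(i * (phi_x - phi_y)))| over time; ranges from 0 to 1."""
    phi_x = np.angle(hilbert(x))
    phi_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))

# Synthetic stand-ins for two EEG channels (not real data):
rng = np.random.default_rng(0)
fs = 256
t = np.arange(0, 2, 1 / fs)
ch_a = np.sin(2 * np.pi * 10 * t)                # 10 Hz oscillation
ch_b = np.sin(2 * np.pi * 10 * t + 0.7)          # same frequency, fixed phase lag
ch_c = np.sin(2 * np.pi * 10 * t + np.cumsum(rng.normal(0, 0.5, t.size)))  # drifting phase

plv_locked = phase_locking_value(ch_a, ch_b)     # near 1: constant phase difference
plv_drift = phase_locking_value(ch_a, ch_c)      # much lower: phase relation drifts
```

A constant phase difference yields a PLV near 1 regardless of the lag itself, which is why PLV captures synchronization rather than simple correlation.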
This study contributes to our understanding of the Metaverse by presenting a case study of brain-computer interface supported, game-based engagement in a Virtual Environment (VE). In a VE, individuals can communicate with anyone, anywhere, at any time, without limits, which can raise the barrier-free living standards of disabled people by providing a more accessible environment. A virtual world of well-being awaits these individuals, primarily through gamified applications enabled by Brain-Computer Interfaces. Virtual environments in the Metaverse can be infinitely large, but the user's movement in a virtual reality (VR) environment is constrained by the physical surroundings. Locomotion has therefore become a popular motion interface, as it allows full exploration of the VE. In this study, the teleport method, one of the locomotion techniques, was used: the user selects the intended location using brain signals and is then instantly transported to that location. The brain signals are decomposed into alpha, beta, and gamma bands, and features of each band signal are extracted in the time, frequency, and time-frequency domains. In the proposed method, the highest binary classification performance was obtained in the frequency domain with the alpha band. Alpha-band signals were tested in all three domains: teleport operations are faster with time-domain features and more stable with frequency-domain features, whereas the Hilbert-Huang Transform (HHT) used in the time-frequency domain could not respond adequately in real-time applications. All these analyses were evaluated in the Erzurum Virtual Tour case study, which was prepared to promote cultural heritage through gamification.
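The band decomposition and frequency-domain feature step described above could be sketched as below: a Butterworth band-pass isolates each band, and Welch band power serves as a frequency-domain feature. The band edges, filter order, and test signal are common choices assumed for illustration, not parameters reported in the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch

# Assumed band edges in Hz; exact cutoffs vary across the EEG literature.
BANDS = {"alpha": (8.0, 13.0), "beta": (13.0, 30.0), "gamma": (30.0, 45.0)}

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase Butterworth band-pass, a standard way to isolate an EEG band."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def band_power(x, fs, lo, hi):
    """Frequency-domain feature: total power in [lo, hi] Hz from a Welch periodogram."""
    f, pxx = welch(x, fs=fs, nperseg=min(len(x), fs))
    mask = (f >= lo) & (f <= hi)
    return pxx[mask].sum() * (f[1] - f[0])   # rectangle-rule integral over the band

# A 10 Hz test tone should concentrate its power in the alpha band.
fs = 128
t = np.arange(0, 4, 1 / fs)
sig = np.sin(2 * np.pi * 10 * t)
alpha_sig = bandpass(sig, *BANDS["alpha"], fs)
powers = {name: band_power(sig, fs, lo, hi) for name, (lo, hi) in BANDS.items()}
```

In a real-time teleport setting, the Welch estimate would be computed over a short sliding window of the incoming EEG stream.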
Automatic detection of people's emotional states is a difficult task for human-machine interfaces. EEG signals, which are very difficult for a person to control, are also used in emotion recognition tasks. In this study, emotion analysis and classification were conducted using EEG signals for different types of stimuli. Combining audio and video information was shown to be more effective for classifying positive/negative (high/low) emotion using the wavelet transform of EEG signals, and a true positive rate of 81.6% was obtained in the valence dimension. Audio information was found to be more effective than video information for classification in the arousal dimension, and a true positive rate of 73.7% was obtained when both the audio and audio + video stimuli were used. Four-class classification performance was also examined in the valence-arousal space.
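The wavelet-based feature extraction mentioned above can be sketched with a minimal multilevel Haar decomposition and per-sub-band energy/entropy features. The study does not specify its mother wavelet or feature set; Haar, four levels, and energy/entropy are illustrative assumptions only.

```python
import numpy as np

def haar_dwt(signal, levels=4):
    """Multilevel orthonormal Haar DWT: returns detail coefficients per level
    plus the final approximation. Energy is preserved across sub-bands."""
    approx = np.asarray(signal, dtype=float)
    details = []
    for _ in range(levels):
        n = (len(approx) // 2) * 2          # drop a trailing odd sample if present
        even, odd = approx[0:n:2], approx[1:n:2]
        details.append((even - odd) / np.sqrt(2))
        approx = (even + odd) / np.sqrt(2)
    return details, approx

def wavelet_features(signal, levels=4):
    """Energy and (Shannon) entropy of each sub-band, a common EEG feature set."""
    details, approx = haar_dwt(signal, levels)
    feats = []
    for coeffs in details + [approx]:
        energy = np.sum(coeffs ** 2)
        p = coeffs ** 2 / (energy + 1e-12)   # normalized coefficient energies
        entropy = -np.sum(p * np.log(p + 1e-12))
        feats.extend([energy, entropy])
    return np.array(feats)

# Example: a 256-sample synthetic "epoch" yields a 10-dimensional feature vector
fs = 128
t = np.arange(0, 2, 1 / fs)
epoch = np.sin(2 * np.pi * 10 * t)
feats = wavelet_features(epoch, levels=4)
```

These per-epoch feature vectors would then be fed to a classifier for the binary valence/arousal decisions; the classifier choice is not fixed by this sketch.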
Abstract — Emotion recognition from EEG signals has an important role in designing Brain-Computer Interfaces. This paper compares the effects of the audio and visual stimuli used for collecting emotional EEG signals on emotion classification performance. For this purpose, EEG data were collected from 25 subjects, and binary classification (low/high) was performed for the valence and arousal emotion dimensions. The wavelet transform was used for feature extraction, and three different classifiers were used for classification. For the audio stimulus, true positive rates of 71.7% and 78.5% were obtained in the valence and arousal dimensions, respectively; for the video stimulus, the corresponding rates were 71% and 82%. Keywords — EEG, Arousal, Valence, Emotion Dimension Classification.