In this paper, we present DREAMER, a multimodal database consisting of electroencephalogram (EEG) and electrocardiogram (ECG) signals recorded during affect elicitation by means of audio-visual stimuli. Signals from 23 participants were recorded along with the participants' self-assessment of their affective state after each stimulus, in terms of valence, arousal, and dominance. All the signals were captured using portable, wearable, wireless, low-cost, off-the-shelf equipment that has the potential to allow the use of affective computing methods in everyday applications. A baseline for participant-wise affect recognition using EEG- and ECG-based features, as well as their fusion, was established through supervised classification experiments using support vector machines (SVMs). The self-assessments of the participants were evaluated through comparison with the self-assessments from another study that used the same audio-visual stimuli. Classification results for valence, arousal, and dominance of the proposed database are comparable to those achieved for other databases that use non-portable, expensive, medical-grade devices. These results indicate the prospects of using low-cost devices for affect recognition applications. The proposed database will be made publicly available in order to allow researchers to perform a more thorough evaluation of the suitability of these capturing devices for affect recognition applications.
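The baseline described above combines EEG- and ECG-based features, fuses them, and classifies with an SVM. The following is a minimal sketch of that general pipeline, not the DREAMER code: the features, dimensions, and labels are synthetic stand-ins, and the fusion shown is simple feature concatenation.

```python
# Hedged sketch of an SVM affect-classification baseline with feature-level
# fusion of two modalities. All feature names, dimensions, and labels are
# illustrative assumptions, not taken from the DREAMER release.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials = 40
eeg = rng.normal(size=(n_trials, 10))   # e.g., band-power features per channel
ecg = rng.normal(size=(n_trials, 4))    # e.g., heart-rate-variability features
# Synthetic binary target standing in for high/low valence self-assessments:
labels = (eeg[:, 0] + ecg[:, 0] > 0).astype(int)

fusion = np.hstack([eeg, ecg])          # feature-level fusion of both modalities
clf = SVC(kernel="rbf")
scores = cross_val_score(clf, fusion, labels, cv=5)
print(round(scores.mean(), 2))
```

In practice each modality would first go through signal-specific preprocessing (artifact removal, band filtering) before feature extraction; the concatenation step is where the "fusion" condition of the baseline differs from the single-modality conditions.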
For some time, equine-assisted therapy (EAT), i.e., the use of horse-related activities for therapeutic reasons, has been recognised as a useful approach in the treatment of many mental health issues such as post-traumatic stress disorder (PTSD), depression, and anxiety. However, despite the interest in EAT, few scientific studies have focused on understanding the complex emotional response that horses seem to elicit in human riders and handlers. In this work, the potential use of affect recognition techniques based on physiological signals is examined for the task of assessing the interaction between humans and horses in terms of the emotional response of the humans to this interaction. Electroencephalography (EEG), electrocardiography (ECG), and electromyography (EMG) signals were captured from humans interacting with horses, and machine learning techniques were applied in order to predict the self-reported emotional states of the human subjects in terms of valence and arousal. Supervised classification experiments demonstrated the potential of this approach for affect recognition during human-horse interaction, reaching an F1-score of 78.27% for valence and 65.49% for arousal.
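The results above are reported as F1-scores for valence and arousal. As a reference for how that metric is computed in the binary case, here is a from-scratch implementation; the example labels are invented for demonstration and do not come from the study's data.

```python
# Binary F1-score computed from scratch, for reference. The label vectors
# below are made-up examples, not data from the human-horse study.
def f1_score(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

# e.g., predicted vs. self-reported high/low valence over eight trials
print(f1_score([1, 0, 1, 1, 0, 0, 1, 0], [1, 0, 0, 1, 0, 1, 1, 0]))  # → 0.75
```

F1 is preferred over plain accuracy in affect recognition studies like this one because self-reported valence and arousal labels are often class-imbalanced.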
Biometric signals have been extensively used for user identification and authentication due to their inherent characteristics that are unique to each person. The variation exhibited between the brain signals (EEG) of different people makes such signals especially suitable for biometric user identification. However, the characteristics of these signals are also influenced by the user's current condition, including his/her affective state. In this paper, we analyze the significance of the affect-related component of brain signals within the subject identification context. Consistent results are obtained across three different public datasets, suggesting that the dominant component of the signal is subject-related, but the affective state also has a contribution that affects identification accuracy. Results show that identification accuracy increases when the system has been trained with EEG recordings that refer to similar affective states as the sample that is to be identified. This improvement holds independently of the features and classification algorithm used, and it is generally above 10% under a rigorous setting, when the training and validation datasets do not share data from the same recording days. This finding emphasizes the potential benefits of considering affective information in applications that require subject identification, such as user authentication.
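A key detail in the abstract above is the "rigorous setting" in which training and validation data come from different recording days. The sketch below illustrates that day-disjoint split; the record structure and day labels are hypothetical, chosen only to make the constraint concrete.

```python
# Hypothetical sketch of a day-disjoint train/validation split, mirroring the
# "rigorous setting" where the two sets never share a recording day.
def day_disjoint_split(records, validation_days):
    """Split records so that no recording day appears in both sets."""
    train, valid = [], []
    for rec in records:
        (valid if rec["day"] in validation_days else train).append(rec)
    return train, valid

records = [
    {"subject": "s1", "day": 1, "features": [0.1, 0.2]},
    {"subject": "s1", "day": 2, "features": [0.3, 0.1]},
    {"subject": "s2", "day": 1, "features": [0.2, 0.4]},
    {"subject": "s2", "day": 2, "features": [0.5, 0.3]},
]
train, valid = day_disjoint_split(records, validation_days={2})
print(sorted({r["day"] for r in train}), sorted({r["day"] for r in valid}))  # [1] [2]
```

Splitting by day rather than by random trial prevents session-specific artifacts (electrode placement, impedance, mood) from leaking between training and validation, which is why the reported 10% improvement under this setting is a conservative estimate.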
Arnau-González, Pablo and Katsigiannis, Stamos and Arevalillo-Herráez, Miguel and Ramzan, Naeem (2021) 'BED: A new dataset for EEG-based Biometrics.', IEEE Internet of Things Journal.
Web 2.0 helped user-generated content platforms spread widely. Unfortunately, it also allowed cyberbullying to spread. Cyberbullying has negative effects that can lead to depression and low self-esteem, so it has become crucial to develop tools for automated cyberbullying detection. Research on developing such tools has grown over the last decade, especially with the recent advances in machine learning and natural language processing. Given the large body of work on this topic, it is vital to critically review the literature on cyberbullying within the context of these latest advances. In this paper, we conduct a survey on the automated detection of cyberbullying. Our survey sheds light on several challenges and limitations for the field, ranging from defining cyberbullying, data collection, and feature representation to model selection, training, and evaluation. We also provide suggestions for improving the task of cyberbullying detection. In addition to the survey, we propose to improve cyberbullying detection by addressing some of the raised limitations: 1) using recent contextual language models such as BERT for the detection of cyberbullying; and 2) using slang-based word embeddings to generate better representations of the cyberbullying-related datasets. Our results show that BERT outperforms state-of-the-art cyberbullying detection models and deep learning models. The results also show that deep learning models initialised with slang-based word embeddings outperform deep learning models initialised with traditional word embeddings.
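The second proposal above, initialising deep learning models with slang-based word embeddings, amounts to building the model's embedding matrix from a slang-aware pre-trained vector table. Here is a minimal NumPy sketch of that initialisation step; the vocabulary, vectors, and fallback scheme are illustrative assumptions, not the paper's actual embeddings.

```python
# Illustrative sketch (not the paper's code) of initialising an embedding
# matrix from pre-trained vectors: known slang terms get their pre-trained
# vector, out-of-vocabulary words fall back to small random initialisation.
import numpy as np

slang_vectors = {                       # toy stand-in for slang-based embeddings
    "lol": np.array([0.1, 0.9, 0.0]),
    "noob": np.array([0.8, 0.2, 0.5]),
}

def build_embedding_matrix(vocab, pretrained, dim=3, seed=0):
    rng = np.random.default_rng(seed)
    matrix = np.zeros((len(vocab), dim))
    for i, word in enumerate(vocab):
        matrix[i] = pretrained.get(word, rng.normal(scale=0.1, size=dim))
    return matrix

vocab = ["lol", "noob", "unseen_word"]
emb = build_embedding_matrix(vocab, slang_vectors)
print(emb.shape)  # (3, 3)
```

The resulting matrix would then be used as the initial weights of a trainable embedding layer; the intuition in the paper is that slang-trained vectors place abusive internet vocabulary more sensibly in the embedding space than embeddings trained on formal text.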