Information about a user's emotional state is an important aspect of affective interaction with embodied conversational agents. Most research aims at identifying emotions through speech or facial expressions. However, facial expressions and speech are not continuously available, and in some cases bio-signal data are also required to fully assess a user's emotional state. We aimed to recognize the six basic emotions proposed by Ekman, using a widely available, low-cost brain-computer interface (BCI) and a biofeedback sensor that measures heart rate. We exposed participants to sets of 10 IAPS images that had been partially validated through a subjective rating protocol. Results showed that the collected signals allowed us to identify the user's emotional state. In addition, the objective measurements partially correlated with the subjective ratings.
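To make the classification setup concrete, here is a minimal sketch of the kind of pipeline this implies: per-trial EEG and heart-rate features fed to a classifier over the six emotion labels. The feature layout, trial counts, and the linear SVM baseline are illustrative assumptions, not the method reported in the abstract.

```python
# Hypothetical sketch: recognizing Ekman's six basic emotions from
# BCI (EEG) and heart-rate features. Feature layout, trial counts,
# and the linear SVM baseline are assumptions for illustration.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

EMOTIONS = ["joy", "anger", "surprise", "disgust", "fear", "sadness"]

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))     # 60 trials x 5 features (placeholder data)
y = np.tile(np.arange(6), 10)    # balanced labels: 10 trials per emotion

# Linear SVM with 5-fold cross-validation as a simple baseline.
scores = cross_val_score(SVC(kernel="linear"), X, y, cv=5)
print(f"mean accuracy: {scores.mean():.2f}  (chance = {1/len(EMOTIONS):.2f})")
```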
This paper presents a multi-modal interactive virtual environment (VE) for job interview training. The proposed platform aims to train candidates (students, job seekers, etc.) to better master their emotional state and behavioral skills. Candidates interact with a virtual recruiter represented by an Embodied Conversational Agent (ECA). Both emotional and behavioral states are assessed using human-machine interfaces and biofeedback sensors, and the ECA asks contextual questions to measure the candidates' technical skills. Collected data are processed in real time by a behavioral engine to enable a realistic multi-modal dialogue between the ECA and the candidate. This work represents a socio-technological breakthrough, opening the way to new possibilities in areas such as professional training and medical applications.
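As a rough illustration of how such a behavioral engine could close the loop between the sensors and the ECA, consider the sketch below. All three functions are hypothetical placeholders, since the abstract does not specify the platform's interfaces.

```python
# Hypothetical real-time loop for a behavioral engine: every function
# here is a placeholder; the actual platform's APIs are not described
# in the abstract.
import time

def read_sensors() -> dict:
    """Placeholder for the BCI/biofeedback acquisition layer."""
    return {"heart_rate": 85.0}

def classify_state(sample: dict) -> str:
    """Placeholder: map raw sensor values to a coarse emotional state."""
    return "stressed" if sample["heart_rate"] > 100 else "calm"

def eca_respond(state: str) -> None:
    """Placeholder: adapt the virtual recruiter's next question or behavior."""
    print(f"ECA adapts its dialogue for a {state} candidate")

for _ in range(10):      # bounded loop for the sketch
    eca_respond(classify_state(read_sensors()))
    time.sleep(0.1)      # ~10 Hz update rate, an arbitrary choice
```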
We present a multi-modal affective virtual environment (VE) for job interview training. The proposed platform supports real-time emotion-based simulations between an ECA and a human. Its first goal is to train candidates (students, job seekers, etc.) to better master their emotional states and behavioral skills. The users' emotional and behavioral states are assessed using different human-machine interfaces and biofeedback sensors, and the collected data are processed in real time by a behavioral engine. A preliminary experiment was carried out to analyze the correspondence between the users' perceived emotional states and the collected data. Participants were instructed to look at a series of sixty IAPS pictures and rate each picture on the following dimensions: joy, anger, surprise, disgust, fear, and sadness.
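The correspondence analysis mentioned above could, for example, be computed as a correlation between per-picture self-ratings and a physiological feature. The following sketch uses Pearson's r on synthetic placeholder data; the paper's actual analysis is not detailed in the abstract.

```python
# Hypothetical sketch of the subjective/objective comparison: Pearson
# correlation between self-rated emotion intensity and a physiological
# feature. The data below are synthetic placeholders.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n_pictures = 60
ratings = rng.uniform(1, 9, n_pictures)                       # e.g. self-rated fear
heart_rate = 70 + 2 * ratings + rng.normal(0, 5, n_pictures)  # placeholder signal

r, p = pearsonr(ratings, heart_rate)
print(f"r = {r:.2f}, p = {p:.3g}")  # a partial correspondence shows as moderate r
```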