Information about a user's emotional state is a very important aspect of affective interaction with embodied conversational agents. Most research aims to identify emotions through speech or facial expressions. However, facial expressions and speech are not continuously available. Furthermore, in some cases, bio-signal data are also required to fully assess a user's emotional state. We aimed to recognize the six basic primary emotions proposed by Ekman, using a widely available, low-cost brain-computer interface (BCI) and a biofeedback sensor that measures heart rate. We exposed participants to sets of 10 IAPS images that had been partially validated through a subjective rating protocol. Results showed that the collected signals allowed us to identify the user's emotional state. In addition, a partial correlation between objective and subjective data was observed.
Information on a customer's emotional state concerning a product or an advertisement is a very important aspect of marketing research. Most studies aim to identify emotions through speech or facial expressions. However, both vary greatly with people's talking habits, so the data are not continuously available. Furthermore, in some cases bio-signal data are also required to fully assess a user's emotional state. We focused on recognising the six basic primary emotions proposed by Ekman using biofeedback sensors that measure heart rate and skin conductance. Participants were shown a series of 12 video-based stimuli that had been validated through a subjective rating protocol. Results showed that the collected signals allowed us to identify the user's emotional state with good accuracy. In addition, a partial correlation between objective and subjective data was observed.
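As a rough illustration of the kind of pipeline these two studies describe (per-trial bio-signal features mapped to one of Ekman's six emotions), the sketch below trains a simple classifier on heart-rate and skin-conductance summaries. The feature choices, the SVM classifier, and the placeholder data are assumptions for illustration, not the authors' actual method.

```python
# Hypothetical sketch: classifying Ekman's six basic emotions from simple
# per-trial bio-signal features. Features, classifier, and data are illustrative.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def trial_features(heart_rate_bpm, skin_conductance_us):
    """Summarize one stimulus presentation as a small feature vector."""
    return np.array([
        heart_rate_bpm.mean(),        # mean heart rate over the trial
        heart_rate_bpm.std(),         # heart-rate variability proxy
        skin_conductance_us.mean(),   # tonic skin-conductance level
        np.ptp(skin_conductance_us),  # phasic response amplitude (max - min)
    ])

# Placeholder data standing in for recorded trials (random, for illustration only).
rng = np.random.default_rng(0)
X = np.vstack([
    trial_features(rng.normal(75, 8, 600), rng.normal(4.0, 0.5, 600))
    for _ in range(120)
])
y = rng.integers(0, len(EMOTIONS), size=len(X))  # one emotion label per trial

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

With real recordings, X would hold one row per stimulus presentation and y the emotion reported or induced for that trial; the cross-validated accuracy then quantifies how well the bio-signals separate the six classes.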
This paper presents a multi-modal interactive virtual environment (VE) for job interview training. The proposed platform aims to train candidates (students, job hunters, etc.) to better master their emotional state and behavioral skills. The candidates will interact with a virtual recruiter represented by an Embodied Conversational Agent (ECA). Both emotional and behavioral states will be assessed using human-machine interfaces and biofeedback sensors. The ECA will ask contextual questions to measure the candidates' technical skills. Collected data will be processed in real time by a behavioral engine to allow a realistic multi-modal dialogue between the ECA and the candidate. This work represents a socio-technological breakthrough, opening the way to new possibilities in areas such as professional and medical applications.
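To make the behavioral-engine idea concrete, here is a minimal sketch of the sort of real-time loop such an engine could run, assuming a normalized stress score derived from the biofeedback sensors; the read_sensors() stub, the thresholds, and the response styles are hypothetical and not taken from the paper.

```python
# Hypothetical sketch of a real-time behavioral-engine loop: map an estimated
# stress score to the ECA's next dialogue strategy. All names are illustrative.
import random
import time

def read_sensors():
    """Stand-in for the biofeedback acquisition layer (returns a 0-1 stress score)."""
    return random.random()

def next_recruiter_move(stress):
    """Pick the virtual recruiter's dialogue strategy from the candidate's stress."""
    if stress > 0.7:
        return "reassure"   # ease the pressure with an encouraging remark
    if stress < 0.3:
        return "challenge"  # ask a harder contextual question
    return "probe"          # continue with a neutral follow-up

if __name__ == "__main__":
    for _ in range(5):      # a few iterations of the interview loop
        stress = read_sensors()
        print(f"stress={stress:.2f} -> ECA action: {next_recruiter_move(stress)}")
        time.sleep(0.5)
```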