We discuss the strong relationship between affect and cognition and the importance of emotions in multimodal human-computer interaction (HCI) and user modeling. We introduce the overall paradigm for our multimodal system, which aims to recognize its users' emotions and to respond to them appropriately for the current context or application. We then describe the design of the emotion elicitation experiment we conducted, in which we collected physiological signals from the autonomic nervous system (galvanic skin response, heart rate, temperature) via wearable computers and mapped them to specific emotions (sadness, anger, fear, surprise, frustration, and amusement). We present the results of three different supervised learning algorithms that categorize these collected signals in terms of emotions and generalize their learning to recognize emotions from new collections of signals. We conclude by discussing the broader impact and potential applications of emotion recognition for multimodal intelligent systems.
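To make the classification step concrete, the following is a minimal sketch in Python (using scikit-learn) of how physiological signal segments might be summarized into feature vectors and compared across three supervised learners. The specific classifiers, the summary statistics, and the helper names (`featurize`, `evaluate`) are illustrative assumptions, not the algorithms or features reported in the study.

```python
# Minimal sketch: classifying emotions from physiological signal segments.
# The three classifiers below are illustrative stand-ins, not necessarily
# the algorithms used in the study; feature extraction is an assumed design.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

EMOTIONS = ["sadness", "anger", "fear", "surprise", "frustration", "amusement"]

def featurize(gsr, heart_rate, temperature):
    """Summarize each raw signal segment with simple statistics
    (mean, std, min, max) -- an assumed feature set."""
    feats = []
    for signal in (gsr, heart_rate, temperature):
        feats.extend([np.mean(signal), np.std(signal),
                      np.min(signal), np.max(signal)])
    return np.array(feats)

def evaluate(X, y):
    """Compare three supervised learners with 5-fold cross-validation."""
    models = {
        "k-NN": KNeighborsClassifier(n_neighbors=5),
        "SVM": SVC(kernel="rbf"),
        "Decision tree": DecisionTreeClassifier(max_depth=5),
    }
    for name, model in models.items():
        scores = cross_val_score(model, X, y, cv=5)
        print(f"{name}: mean accuracy = {scores.mean():.2f}")
```

Given a matrix `X` of featurized segments and a vector `y` of emotion labels, `evaluate(X, y)` reports a cross-validated accuracy per learner, which is the kind of comparison the abstract describes.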
We discuss our approach to developing a novel modality for the computer delivery of Brief Motivational Interventions (BMIs) for behavior change, in the form of a personalized On-Demand VIrtual Counselor (ODVIC) accessed over the internet. ODVIC is a multimodal Embodied Conversational Agent (ECA) that empathically delivers an evidence-based behavior change intervention by adapting, in real time, its verbal and nonverbal communication messages to those of the user during their interaction. We currently focus on excessive alcohol consumption as a target behavior, although our approach is adaptable to other target behaviors (e.g., overeating, lack of exercise, narcotic drug use, non-adherence to treatment). We base our current approach on a successful existing patient-centered brief motivational intervention for behavior change, the Drinker's Check-Up (DCU), whose computer delivery with a text-only interface has been found effective in reducing alcohol consumption in problem drinkers. We discuss the results of users' evaluations of the computer-based DCU intervention delivered with a text-only interface compared with the same intervention delivered by two different ECAs (a neutral one and one with some empathic abilities). Users rated the three systems in terms of acceptance, perceived enjoyment, and intention to use the system, among other dimensions. We conclude with a discussion of how our positive results encourage our long-term goal of on-demand conversations, anytime and anywhere, with virtual agents as personal health and well-being helpers.
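As an illustration of the kind of real-time adaptation described above, the sketch below mirrors a user's detected expression into the agent's nonverbal display and verbal warmth. The expression labels, valence values, and the `adapt`/`verbal_message` helpers are hypothetical and are not the ODVIC implementation.

```python
# Illustrative sketch of an empathic adaptation loop for an ECA.
from dataclasses import dataclass

# Assumed mapping from a detected user expression to an affective valence score.
VALENCE = {"smile": 0.6, "neutral": 0.0, "frown": -0.6}

@dataclass
class AgentState:
    expression: str = "neutral"   # agent's nonverbal display
    warmth: float = 0.5           # how empathic the verbal phrasing is (0..1)

def adapt(state: AgentState, user_expression: str) -> AgentState:
    """Nudge the agent's nonverbal display and verbal warmth toward the
    user's observed affect (a simple mirroring heuristic)."""
    v = VALENCE.get(user_expression, 0.0)
    if v > 0.2:                    # mirror positive affect
        state.expression = "smile"
    elif v < -0.2:                 # show concern for negative affect
        state.expression = "concerned"
        state.warmth = min(1.0, state.warmth + 0.2)
    else:
        state.expression = "neutral"
    return state

def verbal_message(state: AgentState, base_text: str) -> str:
    """Prepend an empathic reflection when warmth is high (hypothetical template)."""
    if state.warmth > 0.7:
        return "It sounds like this has been difficult. " + base_text
    return base_text
```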
This study provides evidence supporting the inclusion of avatars and virtual agents, designed using participatory approaches, in the continuum of care. It indicates an increased probability of engagement and long-term retention among overweight and obese adolescent users, and suggests expanding current chronic care models toward more comprehensive, socio-technical representations.
Previous experience shows that agents such as robots cooperating asynchronously on a sequential task can enter deadlock, where one robot does not fulfill its obligations in a timely manner because of hardware or planning failure, unanticipated delays, etc. Our approach uses a formal multilevel hierarchy of emotions in which emotions both modify active behaviors at the sensory-motor level and change the set of active behaviors at the schematic level. The resulting implementation on a team of heterogeneous robots using a hybrid deliberative/reactive architecture produced the desired emergent societal behavior. Data collected at two different public venues illustrate how a dependent agent selects new behaviors (e.g., stop serving, move to intercept the refiller) to compensate for delays from a subordinate agent (e.g., when it is blocked by the audience). The subordinate also modifies the intensity of its active behaviors in response to feedback from the dependent agent. The agents communicate asynchronously through the Knowledge Query and Manipulation Language (KQML) over wireless Ethernet.
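The sketch below illustrates the two-level idea under stated assumptions: an emotion variable (frustration) attenuates the intensity of active behaviors at the sensory-motor level and, past a threshold, replaces the active behavior set at the schematic level. The class, thresholds, and behavior names are illustrative, not the architecture's actual parameters or behavior library.

```python
# Sketch of a two-level emotion hierarchy for a dependent service robot.
class DependentAgent:
    """Dependent agent (e.g., the serving robot) whose emotion state both
    scales active-behavior intensity and swaps in new behaviors."""

    def __init__(self):
        self.frustration = 0.0                        # grows while the refill is overdue
        self.active_behaviors = {"serve_candy": 1.0}  # behavior name -> intensity

    def update(self, seconds_overdue: float) -> dict:
        # Sensory-motor level: frustration attenuates current behavior intensity.
        self.frustration = min(1.0, seconds_overdue / 60.0)
        for name in self.active_behaviors:
            self.active_behaviors[name] = max(0.0, 1.0 - self.frustration)

        # Schematic level: high frustration replaces the active behavior set,
        # e.g., stop serving and move to intercept the refiller.
        if self.frustration > 0.8:
            self.active_behaviors = {"stop_serving": 1.0, "intercept_refiller": 1.0}
        return self.active_behaviors
```

In a full system, the dependent agent's feedback would be sent asynchronously to the subordinate (the paper uses KQML over wireless Ethernet), which would then adjust the intensity of its own active behaviors in the same fashion.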