2019
DOI: 10.1098/rstb.2018.0026

Live human–robot interactive public demonstrations with automatic emotion and personality prediction

Abstract: Communication with humans is a multi-faceted phenomenon where the emotions, personality and non-verbal behaviours, as well as the verbal behaviours, play a significant role, and human–robot interaction (HRI) technologies should respect this complexity to achieve efficient and seamless communication. In this paper, we describe the design and execution of five public demonstrations made with two HRI systems that aimed at automatically sensing and analysing human participants’ non-verbal behaviour and predicting …

Citations: cited by 22 publications (23 citation statements)
References: 37 publications

Citation statements (ordered by relevance):
“…Indeed, social robots need to be able to sense signals from humans, respond adequately, understand and generate natural language, have reasoning capacities, plan actions and execute movements in line with what is required by the specific context or situation. In this part of our theme issue, three contributions cover the areas of research related to technical solutions for HRI: computational architectures [9], classification and prediction of human behaviour and expressions [10], and natural language processing [11]. These three contributions provide examples of challenges that roboticists and artificial intelligence experts need to face in order to design robots endowed with capabilities crucial for social interactions with humans.…”
Section: Technical solutions for human–robot interaction
Citation type: mentioning (confidence: 99%)
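The capability list in this statement (sensing human signals, natural-language understanding and generation, reasoning, planning, executing movements) maps onto a classic sense–reason–act loop. The sketch below illustrates that general architecture under assumed, hypothetical component interfaces (perception, language, planner, actuator); it is not any cited system's implementation.

```python
# Minimal sense-reason-act loop sketch. All component names and method
# signatures are hypothetical placeholders, not any cited system's API.
class SocialRobotLoop:
    def __init__(self, perception, language, planner, actuator):
        self.perception = perception  # senses human signals (face, voice, body)
        self.language = language      # natural-language understanding/generation
        self.planner = planner        # reasoning and action planning
        self.actuator = actuator      # executes movements and utterances

    def step(self, observation):
        signals = self.perception.sense(observation)           # multimodal cues
        intent = self.language.understand(signals.get("speech", ""))
        plan = self.planner.decide(signals, intent)            # context-aware choice
        reply = self.language.generate(plan)
        self.actuator.execute(plan, reply)                     # act in the world
```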
“…In line with the authors' argumentation, this approach shows that when computational models are implemented in an embodied agent, one can test whether the models correctly capture cognitive mechanisms through examining generated behaviour and interaction with the environment. This paper is followed by a contribution by Gunes et al. [10], which focuses on the capability of artificial systems for online, real-time automatic prediction of the personality of human users, based on their behavioural cues and emotional facial expressions. The authors report a method for users' personality prediction based on features such as facial appearance, geometric facial and body features, as well as temporal relations between the extracted features and continuous annotations.…”
Section: Technical solutions for human–robot interaction
Citation type: mentioning (confidence: 99%)
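To make the described pipeline concrete (per-frame facial and body features plus temporal relations, regressed against continuous personality annotations), here is a minimal sketch. The helper names, the mean-pooling step, and the random-forest regressor are assumptions for illustration only; this is not the actual method of Gunes et al. [10].

```python
# Illustrative sketch: per-frame features plus temporal deltas, pooled per
# clip, regressed against continuous Big Five annotations. Helper names are
# hypothetical; this is not the pipeline of Gunes et al. [10].
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def frame_descriptor(frame):
    # Placeholder: a real system would combine facial appearance,
    # geometric facial landmarks and body features here.
    return np.asarray(frame, dtype=float).ravel()

def extract_features(frames):
    """Per-frame descriptors plus simple temporal deltas between frames."""
    feats = np.stack([frame_descriptor(f) for f in frames])         # (T, D)
    deltas = np.vstack([np.zeros(feats.shape[1]), np.diff(feats, axis=0)])
    return np.hstack([feats, deltas])                               # (T, 2D)

def train_trait_models(frame_sequences, trait_annotations):
    # One feature vector per clip (mean over time), one regressor per trait
    # (e.g. 'extraversion'), fit against continuous annotations.
    X = np.vstack([extract_features(seq).mean(axis=0) for seq in frame_sequences])
    return {trait: RandomForestRegressor(n_estimators=100).fit(X, y)
            for trait, y in trait_annotations.items()}
```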
“…has led to an increased interest in integrating emotions into consumer and sensory research. Several articles (11 studies) in this review utilized FACS for the purpose of product and/or software development; however, the majority (six studies) were focused on developing and/or validating software or consumer-based products that contained software for automatic recognition of facial expressions of affect; only three studies were truly using FACS to assess consumer affective response to product-based stimuli that were being developed, including self-service checkouts and robots (Martin et al., 2013; Tussyadiah and Park, 2018; Gunes et al., 2019). In recent years, there has been a particular focus on the relationship between food and emotions for the sake of understanding food-evoked affect on acceptability, intention to purchase, food choice, attitudes, or behavior, which has led to the introduction of many methods and measures to capture consumers' emotions elicited by food (Lagast et al., 2017; Kaneko et al., 2018).…”
Section: Researchers' proclivity for emotionally-competent stimuli
Citation type: mentioning (confidence: 99%)
“…Based on this review, the automatic systems used for analysis were trained on a variety of databases, including those that contained still images of posed facial expressions (e.g., Pictures of Facial Affect, POFA; Psychological Image Collection at Stirling University, PICS; MMI Facial Expression Database; Cohn-Kanade DFAT-504) as well as those that contained videos depicting spontaneous facial behavior (e.g., Cohn-Kanade AU-Coded Expression Database, CK+; Rutgers and University of California San Diego FACS database, RU-FACS; Binghamton-Pittsburgh 3D Dynamic Spontaneous Facial Expression Database, BP4D). Although most automatic systems were trained on a spontaneous expression database, several did not identify how their system was trained (D'Mello and Graesser, 2010; Hung et al., 2017; Gurbuz and Toga, 2018; Tussyadiah and Park, 2018) or were solely trained on posed still-image databases (Brown et al., 2014; Gunes et al., 2019), which may put their interpretations of participant emotion responses to consumer product-based stimuli at risk. Spontaneous facial expressions differ substantially from posed expressions; subjects often contract different facial muscles when asked to pose an emotion such as fear (subjects perform AUs 1+2) vs. when they are actually experiencing fear (spontaneous fear reliably elicits AUs 1+2+4) (Ekman, 2009).…”
Section: Procedures for utilization of FACS
Citation type: mentioning (confidence: 99%)
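The posed-versus-spontaneous distinction quoted above is concrete enough to encode as a rule: posed fear tends to show AUs 1+2, while spontaneously experienced fear reliably adds AU 4 (Ekman, 2009). A minimal sketch follows, assuming AU indices come from some FACS-based detector; the heuristic is illustrative, not a validated classifier.

```python
# Heuristic illustration of the Ekman (2009) point quoted above:
# posed fear -> AUs 1+2; spontaneous fear -> AUs 1+2+4.
# `detected_aus` is assumed to come from a FACS-based AU detector.
def fear_expression_type(detected_aus: set) -> str:
    if {1, 2, 4} <= detected_aus:
        return "consistent with spontaneous fear (AUs 1+2+4)"
    if {1, 2} <= detected_aus:
        return "consistent with posed fear (AUs 1+2 only)"
    return "no fear-related brow configuration detected"

print(fear_expression_type({1, 2, 4, 20}))  # spontaneous-like
print(fear_expression_type({1, 2}))         # posed-like
```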
“…"Effect of the Robot Behavioral Multimodality on Interaction"). Additionally, we discuss another evaluation for the generated affective behavior of the robot based on the behavioral determinant factors of the participants: personality extraversion [33] and gender.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)