Proceedings of the 19th ACM International Conference on Multimodal Interaction 2017
DOI: 10.1145/3136755.3136785
Head and shoulders: automatic error detection in human-robot interaction

Cited by 22 publications (13 citation statements)
References 10 publications
“…Research in HRI has also investigated how robot failures impact user behaviours, including patterns in eye-gaze, head movements, and speech: social signals that exhibit either established grounding sequences or implicit behavioural responses to failures [6,31,35,76,80]. Behavioural signals have also been examined around unexpected responses in human-robot interactions in the wild [5,30,75], using social signals ranging from low-level sensor input to high-level features that represent affect, attention, and engagement.…”
Section: Robot Failures (mentioning; confidence: 99%)
“…People seem to have various predictable behavioral responses to robotic failures that can be used by robots to identify when a failure has occurred. Failure has been shown to influence users' gaze patterns (Gehle et al., 2015; Hayes et al., 2016; Mirnig et al., 2017), facial expressions (Hayes et al., 2016; Mirnig et al., 2017), head movements (Hayes et al., 2016; Mirnig et al., 2017; Trung et al., 2017), body movements (Mirnig et al., 2017; Trung et al., 2017), and verbal communication (Gieselmann, 2006; Giuliani et al., 2015). Gieselmann (2006) found that indicators for errors in human-robot conversation included sudden changes of the current dialogue topic, indicating non-understanding by asking unspecific questions, asking for additional information, and repeating the previous question.…”
Section: A Unified Information Processing Model For User Centered Fai (mentioning; confidence: 99%)
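The detectors these excerpts describe map low-level behavioural cues (head movement, gaze shifts, speech) to a failure/no-failure decision per time window. A minimal sketch of that idea follows; the feature names, weights, and threshold are illustrative assumptions, not values taken from any of the cited papers.

```python
# Hypothetical sketch of social-signal-based failure detection.
# All feature names, weights, and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Window:
    """Per-time-window social-signal features (hypothetical)."""
    head_motion: float   # mean head-movement magnitude, normalised to 0..1
    gaze_away: float     # fraction of frames with gaze away from task, 0..1
    spoke: bool          # whether the user spoke during this window

def failure_score(w: Window) -> float:
    """Weighted sum of behavioural cues; weights are placeholders."""
    return 0.5 * w.head_motion + 0.4 * w.gaze_away + (0.3 if w.spoke else 0.0)

def detect_failures(windows, threshold=0.6):
    """Return indices of windows whose cue score exceeds the threshold."""
    return [i for i, w in enumerate(windows) if failure_score(w) > threshold]

windows = [
    Window(0.1, 0.0, False),  # calm: user focused on the task
    Window(0.8, 0.7, True),   # abrupt head movement, gaze shift, and speech
    Window(0.2, 0.1, False),
]
print(detect_failures(windows))  # → [1]
```

In practice the cited work replaces this hand-weighted score with a trained classifier over many more features, but the pipeline shape (windowed features in, failure flag out) is the same.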
“…In addition, people respond differently when facing social norm violations and technical failures; in particular, technical failures generally resulted in fewer social signals and faster reaction times [20]. Prior work has demonstrated how social signals such as upper-body movements (e.g., [28]), gaze (e.g., [2,3]), and gestures (e.g., [3]) can be used to detect errors effectively. It is worth noting that most of these prior works used human-like robots, which have been shown to elicit different responses to failures than non-humanoid embodiments in social error scenarios [17].…”
Section: Background and Related Work (mentioning; confidence: 99%)
“…To identify robot errors during situated interactions with people, prior research has investigated how people may respond to a robot's technical and social errors (e.g., [10,13]) and explored how social signals may be used for error detection [28]. To date, research has mostly used anthropomorphic robots when studying human responses to robot errors; little has explored how people exhibit social signals in response to errors produced by non-anthropomorphic robots (e.g., manipulators) that are commonly deployed in workplaces, such as factories.…”
Section: Introduction (mentioning; confidence: 99%)