Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction
DOI: 10.1145/3319502.3374782
Behavioural Responses to Robot Conversational Failures

Abstract: Figure 1: A human user instructing a dual-arm robot to pick and place objects: a) the human utters an instruction, b) the robot attempts to grasp the object, c) the robot indicates incapability through a sudden arm movement. Even though the robot does not have a head and cannot speak, it affords interactional phenomena through non-verbal behaviour. Experiment published in [26].

Cited by 31 publications (18 citation statements) · References 61 publications (37 reference statements)
“…Behavioural signals have also been examined in unexpected responses during human-robot interactions in the wild [5,30,75], using social signals ranging from low-level sensor input to high-level features that represent affect, attention and engagement. Research has also shown that users tend to enact different behavioural responses to failures from human-like robots in contrast to smart-speaker embodiments [49].…”
Section: Robot Failures
confidence: 99%
“…Prior works have demonstrated how social signals such as upper body movements (e.g., [28]), gaze (e.g., [2,3]), and gestures (e.g., [3]) can be used to detect errors effectively. It is worth noting that most of these prior works used human-like robots, which have been shown to elicit different responses to failures than non-humanoid embodiments in social error scenarios [17]. Furthermore, prior research has mostly focused on social interaction scenarios or settings where robots serve as experts or leaders to guide humans through tasks.…”
Section: Background and Related Work
confidence: 99%
“…The mean time between technical failures for robots in the wild is often less than a few hours, and current HRI systems are not yet mature enough to effectively handle unexpected events [70]. Therefore, HRI errors have attracted growing interest in the research community, with recent studies focusing on data collection and experiments contributing to the understanding of how errors influence a user's perceptions and behavioral responses, as well as effective recovery strategies [90]. We survey existing studies on HRI errors with a user-centered perspective in Section 3.1.…”
Section: Related Work on HRI Errors
confidence: 99%