2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2019
DOI: 10.1109/iros40897.2019.8967927
Rebellion and Obedience: The Effects of Intention Prediction in Cooperative Handheld Robots

Abstract: Within this work, we explore intention inference for user actions in the context of a handheld robot setup. Handheld robots share the shape and properties of handheld tools while being able to process task information and aid manipulation. Here, we propose an intention prediction model to enhance cooperative task solving. The model derives intention from the user's gaze pattern which is captured using a robot-mounted remote eye tracker. The proposed model yields real-time capabilities and reliable accuracy up …

Cited by 10 publications (9 citation statements)
References 18 publications
“…If the robot could not arrive at that target in time, they remapped the virtual element to a physical point within the EHD's reachable space. Stolzenwald et al [15] introduced a model that predicts users' interaction location targets based on their eye gaze and task states using a hand-held robot. This model derives intention from the combined information about the user's gaze pattern and task knowledge.…”
Section: User Motion Prediction
confidence: 99%
“…We introduce and compare three strategies to detect human intention using the eye gaze and the hand motion to improve the human immersion. We use the eye-gaze detection rather than the eye-gaze attention used in [2,15,16]; 2.…”
Section: Introduction
confidence: 99%
“…These features contribute to a decreased workers' task load. Further work has introduced a 6-DoF kinematics design [9] and extensive research investigated robot-human communication for user guidance [10] and the perception of users' attention and intention for improved cooperation [11], [12]. These works were motivated by the question how handheld robots and humans benefit from each others' strengths within a single-user collaborative setup.…”
Section: A Handheld Robots
confidence: 99%
“…Handheld robots [8]- [12] are intelligent tools that process task knowledge and environment information, Department of Computer Science, University of Bristol, UK, .... janis.stolzenwald.2015@my.bristol.ac.uk, wmayol@cs.bris.ac.uk which allows for semi-autonomous assistance in collaborative task solving, and combine these with the natural competences of human users for negotiating obstacles and resolving complex motion planning tasks effortlessly. We argue, that such a system could bridge the aforementioned gap between remote guidance and telemanipulation, with the handheld robot helping both the effective communication between the workers and task outcomes.…”
Section: Introduction
confidence: 99%
“…Despite the potential advantages of the egocentric-based head and eye gesture recognition, this problem has been scarcely addressed in HRI. Exceptions include a system to request and guide a robot to find some object [29], helping navigation to wheelchair users with limited hand mobility [14], or predicting user intention from gaze in grasping tasks [13] and hand-held robots [24]. Most of these works have in common that they seek to assist disabled people.…”
Section: Introduction
confidence: 99%