2017
DOI: 10.1109/thms.2017.2647882

Implicit Intention Communication in Human–Robot Interaction Through Visual Behavior Studies

Cited by 61 publications (36 citation statements)
References 47 publications
“…In the early 2000s, the eye tracker was used as a direct substitute for a handheld mouse, such that the gaze point on a computer display designates the cursor's position and blinks function as button clicks (Lin et al., 2006; Gajwani and Chhabria, 2010). Since 2015, eye gaze has been used to communicate a 3D target position (Li et al., 2015a, 2017; Dziemian et al., 2016; Li and Zhang, 2017; Wang et al., 2018; Zeng et al., 2020) for directing the movement of a robotic end effector. No action recognition was required, as these methods assumed specific actions in advance, such as reach and grasp (Li et al., 2017), write and draw (Dziemian et al., 2016), and pick and place (Wang et al., 2018).…”
Section: Related Work
confidence: 99%
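The excerpt above describes the early gaze-as-mouse scheme: the gaze point designates the cursor position and a deliberate blink acts as a button click. A minimal sketch of that control loop is below; all class names, thresholds, and the smoothing scheme are illustrative assumptions, not the cited systems' actual parameters.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class GazeSample:
    """One sample from a hypothetical screen-mounted eye tracker."""
    x: float          # gaze x on screen, pixels
    y: float          # gaze y on screen, pixels
    eye_open: bool    # False while the eye is closed (blink)


class GazeMouse:
    """Gaze point drives the cursor; a long blink is treated as a click."""

    def __init__(self, blink_click_frames: int = 12, alpha: float = 0.3):
        # blink_click_frames: consecutive closed-eye samples counted as a
        # deliberate click (assumed ~200 ms at a 60 Hz sample rate).
        # alpha: exponential-smoothing factor to damp fixation jitter.
        self.blink_click_frames = blink_click_frames
        self.alpha = alpha
        self._closed = 0
        self._cursor: Optional[Tuple[float, float]] = None

    def update(self, s: GazeSample) -> Tuple[Optional[Tuple[float, float]], bool]:
        """Process one sample; return (cursor position, click event)."""
        click = False
        if not s.eye_open:
            self._closed += 1
        else:
            # A sufficiently long closure registers as a click on eye reopening.
            if self._closed >= self.blink_click_frames:
                click = True
            self._closed = 0
            if self._cursor is None:
                self._cursor = (s.x, s.y)
            else:
                cx, cy = self._cursor
                self._cursor = (cx + self.alpha * (s.x - cx),
                                cy + self.alpha * (s.y - cy))
        return self._cursor, click
```

The blink-duration threshold is what separates deliberate "clicks" from the involuntary blinks that any gaze interface must ignore; real systems tune this per user.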
“…1) was applied, which had been used successfully in previous research. [15][16][17][18][19][20][21] It is an ultra-portable device (320 x 45 x 40 mm, 145 g) that can track movement of 25 cm horizontally, 11 cm vertically, and 15 cm in depth; it can be mounted on the monitor and uses infrared camera observation and image processing to detect and follow eye movement at a 60 Hz sample rate. It is easy to handle, with 0.5–1 degree of visual angle accuracy and 5- or 9-point calibration options.…”
Section: The Gazepoint GP3 Eye-Tracker Hardware Unit
confidence: 99%
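The quoted 0.5–1 degree of visual angle translates into an on-screen error radius that grows with viewing distance. A short worked computation, assuming a 60 cm viewing distance (the excerpt does not state one):

```python
import math


def gaze_error_cm(accuracy_deg: float, viewing_distance_cm: float) -> float:
    """On-screen error radius implied by an angular accuracy at a given
    viewing distance, via the tangent projection error = d * tan(theta)."""
    return viewing_distance_cm * math.tan(math.radians(accuracy_deg))


# At an assumed 60 cm viewing distance:
#   0.5 deg -> ~0.52 cm on-screen error
#   1.0 deg -> ~1.05 cm on-screen error
```

This is why gaze-driven interfaces need targets on the order of a centimeter or larger at typical desktop distances; finer selection requires magnification or dwell-based refinement.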
“…Joint action in humans: Feinman et al., 1992; Chartrand and Bargh, 1999; Driver et al., 1999; Tomasello, 2000; Moore and D'Entremont, 2001; Sebanz et al., 2003, 2006; Reed et al., 2006; Warneken and Tomasello, 2006; Ganesh et al., 2014; Sawers et al., 2017; Mojtahedi et al., 2017a,b.
Joint action in human–robot interaction: Breazeal et al., 2005; Lawitzky et al., 2010; Wang et al., 2013; Magrini et al., 2015; Rozo et al., 2015; Li and Zhang, 2017.
Implicit communication in robot–robot applications, two robots: Aiyama et al., 1999; Pereira et al., 2002; multiple robots: Martinoli and Easton, 2003; Groß and Dorigo, 2004; Groß et al., 2006; Ducatelle et al., 2011; Winfield and Erbas, 2011; Wang and Schwager, 2016.
Work exploring the use of explicit communication in robotic systems has been omitted, as explicit communication protocols are so widespread that they are fundamental to most robotics research.…”
Section: Category Publications
confidence: 99%