Robotics: Science and Systems VII 2011
DOI: 10.15607/rss.2011.vii.027

Comparing Heads-up, Hands-free Operation of Ground Robots to Teleoperation

Abstract: Today, most commercially available UGVs use teleoperation for control. Under teleoperation, users' hands are occupied holding a handheld controller to operate the UGV, and their attention is focused on what the robot is doing. In this paper, we propose an alternative called Heads-up, Hands-free Operation, which allows an operator to control a UGV using operator-following behaviors and a gesture interface. We explore whether Heads-up, Hands-free Operation is an improvement over teleoperation. In a stud…

Cited by 7 publications (8 citation statements) | References 36 publications
“…In many practical applications, a human instructs the UGV to perform certain tasks, such as changing its motion or speed, taking photographs, or making phone calls. These instructions are typically communicated using voice commands (Fritsch et al., 2004), hand gestures (Doisy et al., 2013; Marge et al., 2011), or haptic interfaces (Ghosh et al., 2014; Park and Howard, 2010). Moreover, some smart carts and autonomous luggage robots allow users to interact using smartphone applications (DiGiacomcantonio and Gebreyes, 2014).…”
Section: Categorization of Autonomous Person-Following Behaviors
Confidence: 99%
“…The TeamTalk platform used in this work offers a bidirectional method for interpreting commands by detecting problems with task execution, requesting help, and responding to amended instructions from the human. The interaction modality is spoken dialogue; thus, the human is able to interact in a "heads-up, hands-free" way [69]. When disambiguating referring expressions, TeamTalk uses natural-language generation templates to describe its surroundings.…”
Section: Asking for Help
Confidence: 99%
“…The majority of gesture systems today focus on gesture recognition [20], which is a classification task that does not require the location or orientation of the gesture [27, 18]. Often, this recognition is performed in batch and has a slightly different goal, namely to identify the times in a video clip at which certain gestures occur.
Section: Related Work
Confidence: 99%