2012
DOI: 10.1007/978-3-642-34103-8_37

User-Defined Body Gestures for Navigational Control of a Humanoid Robot

Abstract: This paper presents a study that allows users to define intuitive gestures to navigate a humanoid robot. For eleven navigational commands, 385 gestures, performed by 35 participants, were analyzed. The results of the study reveal user-defined gesture sets for both novice users and expert users. In addition, we present a taxonomy of the user-defined gesture sets, agreement scores for the gesture sets, time performances of the gesture motions, and implications for the design of the robot control, with a f…

Cited by 34 publications (38 citation statements)
References 9 publications

“…The system never recognized a wrong gesture, so the […]. Overall, we got positive feedback about the integration, […] interact by using the new gesture set. As far as the distribution of our gesture set is concerned, we got […] comparison to our previous work that investigated user-defined gestures for a humanoid robot [14]. We got much less deictic gestures and more metaphoric ones.…”
Section: Discussion (mentioning)
confidence: 95%
“…Researchers already adopted this process for other areas, e.g. Kurdyukova et al [11] used it to design gestures for transferring data between tablet computers and a multi-display environment, and Obaid et al [14] applied the process to create a full body gesture set for the navigational control of humanoid robots. In this paper, we adapt the process by Wobbrock et al to identify intuitive gestures for an interactive storytelling scenario.…”
Section: User Defined Gestures (mentioning)
confidence: 99%
“…An approach that employs a user-defined gesture set has been presented by Wobbrock et al for surface computing [14], and was adapted for other areas, such as public displays [9] or human robot interaction [11]. Its basic idea is to show specific effects within a system to users, who are then asked to perform gestures that should trigger these effects.…”
Section: Introduction (mentioning)
confidence: 99%
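
The abstract reports agreement scores for the elicited gesture sets, and the last citation statement summarizes the underlying elicitation idea (show an effect to users and let them propose a gesture for it). Purely as an illustrative sketch, the snippet below shows how such an agreement score is commonly computed in the style of Wobbrock et al.; the function name `agreement_scores` and the example commands and gesture proposals are hypothetical and not taken from the paper.

```python
from collections import Counter

def agreement_scores(proposals_per_referent):
    """Agreement score in the style of Wobbrock et al.:
    for each referent r, A_r is the sum over groups of identical proposals P_i
    of (|P_i| / |P_r|)^2; the overall score is the mean of A_r over referents.
    """
    per_referent = {}
    for referent, gestures in proposals_per_referent.items():
        total = len(gestures)
        groups = Counter(gestures)  # identical proposals form one group
        per_referent[referent] = sum((n / total) ** 2 for n in groups.values())
    overall = sum(per_referent.values()) / len(per_referent)
    return per_referent, overall

# Hypothetical example: five participants proposing gestures for two commands.
proposals = {
    "walk forward": ["lean forward", "lean forward", "lean forward",
                     "point ahead", "step in place"],
    "stop": ["raise palm", "raise palm", "raise palm", "raise palm", "cross arms"],
}
per_referent, overall = agreement_scores(proposals)
print(per_referent)  # roughly {'walk forward': 0.44, 'stop': 0.68}
print(overall)       # roughly 0.56
```

Higher values indicate that participants converged on the same gesture for a command; a score of 1.0 would mean every participant proposed an identical gesture for that referent.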