2010 IEEE/RSJ International Conference on Intelligent Robots and Systems
DOI: 10.1109/iros.2010.5650910
Natural language command of an autonomous micro-air vehicle

Abstract: Natural language is a flexible and intuitive modality for conveying directions and commands to a robot, but it presents a number of computational challenges. Diverse words and phrases must be mapped into structures that the robot can understand, and elements in those structures must be grounded in an uncertain environment. In this paper we present a micro-air vehicle (MAV) capable of following natural language directions through a previously mapped and labeled environment. We extend our previous work in un…
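The abstract describes two steps: mapping phrases to structures the robot understands, and grounding those structures in a labeled map. The toy sketch below illustrates that idea only; it is not the paper's system, and the map labels, coordinates, and function names are invented for illustration.

```python
# Illustrative sketch (not the paper's actual method): parse a spoken-style
# direction into landmark references, then ground each reference in a
# previously labeled map. All labels and coordinates here are hypothetical.

LABELED_MAP = {           # landmark name -> (x, y) waypoint in the map frame
    "hallway": (0.0, 5.0),
    "kitchen": (3.0, 5.0),
    "elevator": (8.0, 5.0),
}

VERBS = {"go", "fly", "move"}  # minimal motion-verb lexicon

def parse_command(text):
    """Map a command string to a simple structure: a verb plus an
    ordered list of landmark references."""
    tokens = text.lower().replace(",", " ").split()
    landmarks = [t for t in tokens if t in LABELED_MAP]
    verb = next((t for t in tokens if t in VERBS), None)
    return {"verb": verb, "landmarks": landmarks}

def ground(parsed):
    """Ground each landmark reference to a waypoint coordinate."""
    return [LABELED_MAP[name] for name in parsed["landmarks"]]

parsed = parse_command("Fly past the kitchen to the elevator")
waypoints = ground(parsed)
print(parsed["verb"], waypoints)  # fly [(3.0, 5.0), (8.0, 5.0)]
```

A real system must additionally handle grounding under uncertainty (ambiguous or unseen landmarks), which this sketch sidesteps by assuming an exact, fully labeled map.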

Cited by 28 publications (16 citation statements) | References 16 publications
“…Extensive research on teleoperation of small-sized aerial robots, with various types of input devices and corresponding interactions, has been conducted in the past decade. Despite the popularity of novel interaction paradigms such as hand gesture (Yu et al, 2017; Duan et al, 2018), body gesture (Rognon et al, 2018), gaze (Yu et al, 2014; Erat et al, 2018; Yuan et al, 2019), and language (Huang et al, 2010, 2019), more recent work still focuses on aspects of traditional joystick-based teleoperation of small-sized UAVs, for example avoiding collisions during navigation (Cho et al, 2017). Sanders et al (2018) report that operators still prefer joystick control over indirect gaze-based steering, whereas the findings of Herrmann and Schmidt (2018) indicate that a traditional input device is more efficient than their extensive and carefully designed system based on natural interactions.…”
Section: Related Work
confidence: 99%
“…A quadcopter navigated in indoor environments under a human's spoken guidance. (d) shows social accompaniment [14]: a pet dog plays ball with a human through socialized verbal communication.…”
Section: A Background
confidence: 99%
“…The user may interact with drones through joystick controllers, touchscreens [2,6], body gestures [11,31], natural language commands [21], etc.…”
Section: B Human-Drone Interaction For Drone Navigation
confidence: 99%