2016 International Conference on Unmanned Aircraft Systems (ICUAS), 2016
DOI: 10.1109/icuas.2016.7502665

Natural user interfaces for human-drone multi-modal interaction

Abstract: Personal drones are becoming part of everyday life. To fully integrate them into society, it is crucial to design safe and intuitive ways to interact with these aerial systems. Recent advances in User-Centered Design (UCD) applied to Natural User Interfaces (NUIs) aim to make use of innate human abilities, such as speech, gestures and vision, to interact with technology the way humans would with one another. In this paper, a Graphical User Interface (GUI) and several NUI methods are studied and implem…

Cited by 119 publications (61 citation statements); references 31 publications.
“…To achieve fully autonomous operation, a large number of additional components, which are outside the scope of this paper, have been used. These additional components are: perception and state estimation, Sanchez-Lopez et al [35], Bavle et al [4]; control, Pestana et al [24], Olivares-Mendez et al [23]; mission plan specification, Molina et al [19,20]; multi-robot mission planning, Sampedro et al [31]; and human-machine interfaces, Suárez Fernández et al [40], among others.…”
Section: Methods
confidence: 99%
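The component list in this citation reads as a layered autonomy architecture: perception and state estimation feed a controller, which executes waypoints produced by a mission planner, while a human-machine interface reports status to the operator. Below is a minimal Python sketch of how such layers might be composed into one control loop; all class and method names are illustrative assumptions and do not reflect the actual Aerostack API.

```python
# Illustrative sketch of a layered UAS autonomy stack, loosely mirroring
# the component list above (perception/state estimation, control, mission
# planning, human-machine interface). Names are hypothetical, not the
# real Aerostack API.
from dataclasses import dataclass

@dataclass
class StateEstimate:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

class PerceptionModule:
    def estimate_state(self) -> StateEstimate:
        # A real system would fuse IMU, vision, and other sensors here.
        return StateEstimate()

class Controller:
    def track(self, state: StateEstimate, waypoint: tuple) -> None:
        # Compute and send low-level commands toward the waypoint.
        print(f"tracking {waypoint} from ({state.x}, {state.y}, {state.z})")

class MissionPlanner:
    def __init__(self, waypoints):
        self.waypoints = list(waypoints)

    def next_waypoint(self):
        return self.waypoints.pop(0) if self.waypoints else None

class HumanMachineInterface:
    def report(self, message: str) -> None:
        print(f"[HMI] {message}")

class AutonomyStack:
    """Composes the layers and runs one perception-plan-control tick."""
    def __init__(self, planner: MissionPlanner):
        self.perception = PerceptionModule()
        self.controller = Controller()
        self.planner = planner
        self.hmi = HumanMachineInterface()

    def tick(self) -> bool:
        waypoint = self.planner.next_waypoint()
        if waypoint is None:
            self.hmi.report("mission complete")
            return False
        self.controller.track(self.perception.estimate_state(), waypoint)
        return True

if __name__ == "__main__":
    stack = AutonomyStack(MissionPlanner([(0, 0, 2), (5, 0, 2)]))
    while stack.tick():
        pass
```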
“…The result of their simulation showed that human operators could interact effectively and reliably with UAVs via the multiple modalities of speech and gesture, in autonomous, mixed-initiative, or teleoperation mode. Fernandez [15] investigated the use of natural user interfaces (NUIs) in the control of small UAVs, using a custom Aerostack software framework, which they developed by combining several NUI methods and computer vision techniques. Their project was aimed at studying, implementing, and validating the efficiency of NUIs in human-UAV interaction [15].…”
Section: Multimodal Speech and Gesture Interfaces
confidence: 99%
“…Harris and Barber [16,17] investigated the performance of a speech and gesture multimodal interface for soldier-robot team communication during an ISR mission.…”
Section: Multimodal Speech and Gesture Interfaces
confidence: 99%
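A common way to realize the speech-plus-gesture interaction these citations describe is late fusion: each modality independently produces a command hypothesis with a confidence score, and agreement between modalities raises the combined score. The sketch below illustrates one such policy; the function, thresholds, and command names are hypothetical and are not taken from the cited systems.

```python
# Hypothetical late-fusion policy for multimodal drone commands: a speech
# hypothesis and a gesture hypothesis are combined into a single command.
# All names and thresholds are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ModalityHypothesis:
    command: str       # e.g. "takeoff", "land", "go_left"
    confidence: float  # in [0.0, 1.0]

def fuse(speech: Optional[ModalityHypothesis],
         gesture: Optional[ModalityHypothesis],
         agreement_bonus: float = 0.2,
         threshold: float = 0.6) -> Optional[str]:
    """Return a command only if the fused confidence clears the threshold."""
    hypotheses = [h for h in (speech, gesture) if h is not None]
    if not hypotheses:
        return None
    # If both modalities agree, boost the confidence; otherwise fall back
    # to the single most confident hypothesis.
    if len(hypotheses) == 2 and speech.command == gesture.command:
        best = speech
        score = min(1.0, max(speech.confidence, gesture.confidence) + agreement_bonus)
    else:
        best = max(hypotheses, key=lambda h: h.confidence)
        score = best.confidence
    return best.command if score >= threshold else None

# Example: a weak spoken "land" confirmed by a matching gesture clears the
# threshold (0.55 + 0.2 = 0.75 >= 0.6), while either alone would be rejected.
print(fuse(ModalityHypothesis("land", 0.5), ModalityHypothesis("land", 0.55)))
```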
“…In [22], the authors expanded Aerostack's capabilities to demonstrate the benefit of using a coordinator to accomplish high-level missions requested by the user with a fully autonomous swarm of UAS. In [28], Aerostack was used for research and development of Natural User Interfaces for Human-Drone Interaction using hand gestures, speech, body movements and visual cues.…”
Section: Reported Uses of Aerostack
confidence: 99%