2014
DOI: 10.14236/ewic/hci2014.34
Entertainment Multi-rotor Robot that realises Direct and Multimodal Interaction

Abstract: We explore direct interaction between humans and multi-rotor robots and its applications for entertainment. In this paper, we present our system, which realises direct and multimodal interaction using onboard cameras and a microphone. With these onboard sensors detecting human actions, the robot's reactions chain and expand one after another. In addition, as all the processing is executed on the onboard computer, there is no need for external devices. We describe its interaction scenario from take-off to landing…
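To make the described scenario concrete, below is a minimal sketch of how such an onboard multimodal control loop could be structured, assuming speech handles take-off and landing while gestures steer the airborne robot. The helper functions (capture_audio, recognize_speech, detect_gesture, send_flight_command) are hypothetical placeholders, not the authors' actual interfaces.

```python
# Minimal sketch (not the authors' implementation) of an onboard multimodal
# control loop: speech triggers take-off and landing, gestures steer the robot,
# and all processing runs on the onboard computer. The four helpers below are
# hypothetical stand-ins for the microphone, speech recogniser, camera-based
# gesture detector, and flight-controller interface described in the abstract.

import cv2  # onboard camera access via OpenCV


def capture_audio():
    """Placeholder: grab a short audio buffer from the onboard microphone."""
    return None


def recognize_speech(audio):
    """Placeholder: return a keyword such as 'take off' or 'land', or None."""
    return None


def detect_gesture(frame):
    """Placeholder: return a movement command such as 'left' or 'right', or None."""
    return None


def send_flight_command(command):
    """Placeholder: forward a command to the flight controller."""
    print("command:", command)


def control_loop(camera_index=0):
    state = "grounded"
    camera = cv2.VideoCapture(camera_index)
    while True:
        ok, frame = camera.read()
        if not ok:
            continue
        word = recognize_speech(capture_audio())
        if state == "grounded":
            if word == "take off":
                send_flight_command("takeoff")
                state = "flying"
        else:  # flying
            if word == "land":
                send_flight_command("land")
                state = "grounded"
            else:
                gesture = detect_gesture(frame)
                if gesture is not None:
                    # each detected action feeds the next reaction in the chain
                    send_flight_command(gesture)


if __name__ == "__main__":
    control_loop()
```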

Cited by 4 publications (2 citation statements)
References 16 publications
“…Users that tried the system provided positive feedback, and most of them expressed that talking to a drone was like interacting with a pet. Speech has also been used as an input channel for multimodal ground control stations [33], and direct multimodal interaction [29], [34]. As these projects include more than one interaction technique, they are further explained in the multimodal section below.…”
Section: B. Speech (mentioning)
Confidence: 99%
“…A summary of the papers related to multimodal control can be seen in Table 8, along with the prototype specifications, the control modalities involved, and the user studies performed. A multimodal approach can be used to create direct interaction with drones, for example by taking off and landing through speech and controlling movement by gesture [29], [34]. In these studies, a quadcopter prototype can be controlled solely by using onboard sensors to detect gesture and speech interactions, without the need for any external devices.…”
Section: Multimodal (mentioning)
Confidence: 99%