Drones are becoming increasingly popular in military applications and in civil aviation among hobbyists and businesses. Achieving natural Human-Drone Interaction (HDI) would allow unskilled pilots to take part in flying these devices and, more generally, ease the use of drones. The research in this paper focuses on the design and development of a Natural User Interface (NUI) that allows a user to pilot a drone with body gestures. A Microsoft Kinect captures the user's body information, which is processed by a motion recognition algorithm and converted into commands for the drone. A Graphical User Interface (GUI) gives feedback to the user: visual feedback from the drone's onboard camera is shown on screen, and an interactive menu, itself controlled by body gestures, offers functionalities such as photo and video capture or take-off and landing. The result is an efficient and functional system that is more instinctive, natural, immersive and fun than piloting with a physical controller, with innovative aspects such as additional piloting functionalities and control of flight speed.
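The pipeline the abstract describes (Kinect skeleton capture, then a motion recognition step, then drone commands) can be illustrated with a minimal sketch. Everything below is an illustrative assumption rather than the paper's actual implementation: the `Joint` type, the pose thresholds, and the `gesture_to_command` mapping are hypothetical stand-ins for joint data that the Kinect SDK's skeletal tracking would deliver.

```python
# Hedged sketch of a gesture-to-command mapping for drone piloting.
# All names and thresholds here are illustrative assumptions, not the
# paper's code; real input would come from Kinect skeletal tracking.

from dataclasses import dataclass

@dataclass
class Joint:
    """A 3D joint position in metres, in a sensor-centred frame."""
    x: float  # lateral, positive = user's right
    y: float  # vertical, positive = up
    z: float  # depth, distance from the sensor

def gesture_to_command(left_hand: Joint, right_hand: Joint,
                       torso: Joint, threshold: float = 0.25) -> str:
    """Map one captured pose to a coarse drone command.

    Both hands raised above the torso by more than `threshold` metres
    means ascend; both lowered means descend; both pushed toward the
    sensor (smaller z than the torso) means fly forward; a single
    raised arm means turn toward that side; otherwise hover.
    """
    if (left_hand.y - torso.y > threshold
            and right_hand.y - torso.y > threshold):
        return "ascend"
    if (torso.y - left_hand.y > threshold
            and torso.y - right_hand.y > threshold):
        return "descend"
    if (torso.z - left_hand.z > threshold
            and torso.z - right_hand.z > threshold):
        return "forward"
    if right_hand.y - torso.y > threshold:
        return "turn_right"
    if left_hand.y - torso.y > threshold:
        return "turn_left"
    return "hover"
```

A usage example: with the torso at 1.1 m and both hands at 1.5 m, the pose clears the 0.25 m threshold on both sides and maps to an ascend command.

```python
cmd = gesture_to_command(Joint(-0.4, 1.5, 2.0),
                         Joint(0.4, 1.5, 2.0),
                         Joint(0.0, 1.1, 2.0))
# cmd == "ascend"
```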