Unmanned aircraft systems (UASs) have seen a dramatic increase in military operations over the last two decades. The increased demand for their capabilities on the battlefield has resulted in quick fielding with user interfaces designed more with engineers in mind than UAS operators. UAS interfaces tend to support tele-operation with a joystick or complex, menu-driven interfaces with a steep learning curve. These approaches to control demand constant attention to manage even a single UAS, and they increase heads-down time in an interface as the operator searches for and clicks on the right menus to invoke commands. The time and attention these interfaces require make it difficult to expand a single operator's span of control to multiple UASs or to the control of sensor systems. In this paper, we explore an alternative to the standard menu-based control interfaces. Our approach was to first study how operators might want to task a UAS if they were not constrained by a typical menu interface. Based on this study, we developed a prototype multi-modal dialogue interface for more intuitive control of multiple unmanned aircraft and their sensor systems using speech and map-based gesture/sketch. The resulting system is a two-way interface that allows a user to draw on a map while speaking commands, and that provides feedback so the user knows what the system is doing. When the system does not understand the user (for example, because speech recognition failed or because the user did not provide enough information), it engages the user in a dialogue to gather the information needed to perform the command. With the help of UAS operators, we conducted a user study comparing our prototype against a representative menu-based control interface in terms of usability, time on task, and mission effectiveness. This paper describes the initial study of how people might use a natural interface, the prototype system itself, and the results of the user study.
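To make the clarification-dialogue behavior concrete, the following is a minimal sketch of the kind of slot-filling loop described above. It is not the prototype's implementation; all names (REQUIRED_SLOTS, interpret, dialogue_loop) are hypothetical.

```python
# Hypothetical sketch of the clarification dialogue described above:
# a command executes only once all required slots are filled; otherwise
# the system prompts the user for the missing information.

REQUIRED_SLOTS = {
    "surveil": ["vehicle", "area"],
    "goto": ["vehicle", "waypoint"],
}

def interpret(utterance, sketch):
    """Toy interpreter: pull an action from speech, geometry from sketch."""
    command = {"action": None}
    for action in REQUIRED_SLOTS:
        if action in utterance.lower():
            command["action"] = action
    if "uav-1" in utterance.lower():
        command["vehicle"] = "UAV-1"
    if sketch is not None:
        command["area"] = sketch       # e.g., a polygon drawn on the map
        command["waypoint"] = sketch   # or a point, depending on the action
    return command

def dialogue_loop(utterance, sketch, ask_user):
    command = interpret(utterance, sketch)
    if command["action"] is None:
        # Speech recognition failed or no known action was mentioned.
        return ask_user("Sorry, I did not understand. What should I do?")
    # Ask follow-up questions until every required slot is filled.
    for slot in REQUIRED_SLOTS[command["action"]]:
        while command.get(slot) is None:
            command[slot] = ask_user(f"Which {slot} should I use?")
    return command  # ready to translate into a vehicle command

if __name__ == "__main__":
    # Simulated user: speech omitted the vehicle, so the system asks for it.
    cmd = dialogue_loop("surveil this area", sketch="polygon-42",
                        ask_user=lambda q: (print(q), "UAV-1")[1])
    print(cmd)
```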
Military unmanned systems today are typically controlled by two methods: tele-operation or menu-based, search-and-click interfaces. Both approaches demand the operator's constant vigilance: tele-operation requires continuous input to drive the vehicle inch by inch, while a menu-based interface requires eyes on the screen to search through alternatives and select the right menu item. In both cases, operators spend most of their time and attention driving and minding the unmanned systems rather than acting as warfighters. With these approaches, the platform and interface become more of a burden than a benefit. The availability of inexpensive sensor systems in products such as the Microsoft Kinect™ or Nintendo Wii™ has led to new ways of interacting with computing systems, but new sensors alone are not enough. Developing useful and usable human-system interfaces requires understanding users and interaction in context: not just what new sensors afford in terms of interaction, but how users want to interact with these systems, for what purpose, and how sensors might enable those interactions. Additionally, the system needs to reliably make sense of the user's inputs in context, translate that interpretation into commands for the unmanned system, and give feedback to the user. In this paper, we describe an example natural interface for unmanned systems, called the Smart Interaction Device (SID), which enables natural two-way interaction with unmanned systems, including the use of speech, sketch, and gestures. We present several example applications of SID to different types of unmanned systems and different kinds of interactions.
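As a rough illustration of that interpret-translate-feedback cycle, the sketch below (again hypothetical, not SID's actual architecture) time-aligns a speech result with sketch strokes drawn around the same moment, translates the fused interpretation into a platform command, and produces feedback for the user.

```python
# Hypothetical sketch of fusing speech with sketch input by timestamp,
# then translating the result into a vehicle command plus user feedback.

from dataclasses import dataclass

@dataclass
class SpeechResult:
    text: str
    t_start: float
    t_end: float

@dataclass
class Stroke:
    points: list      # map coordinates
    timestamp: float

def fuse(speech: SpeechResult, strokes: list, window: float = 2.0):
    """Attach strokes drawn within `window` seconds of the utterance."""
    nearby = [s for s in strokes
              if speech.t_start - window <= s.timestamp <= speech.t_end + window]
    return {"text": speech.text, "geometry": [s.points for s in nearby]}

def to_vehicle_command(fused):
    """Translate the fused interpretation into a platform command."""
    if "follow" in fused["text"] and fused["geometry"]:
        return {"type": "FOLLOW_ROUTE", "route": fused["geometry"][0]}
    return None

speech = SpeechResult("follow this road", t_start=10.0, t_end=11.5)
strokes = [Stroke(points=[(0, 0), (5, 3), (9, 7)], timestamp=10.8)]
cmd = to_vehicle_command(fuse(speech, strokes))
print("Feedback to user:",
      "Following the drawn route." if cmd else
      "I heard you, but please draw the route on the map.")
```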