
Proceedings Paper

Multi-modal interaction for UAS control
Author(s): Glenn Taylor; Ben Purman; Paul Schermerhorn; Guillermo Garcia-Sampedro; Robert Hubal; Kathleen Crabtree; Allen Rowe; Sarah Spriggs

Paper Abstract

Unmanned aircraft systems (UASs) have seen a dramatic increase in military operations over the last two decades. The increased demand for their capabilities on the battlefield has resulted in rapid fielding of systems whose user interfaces were designed more with engineers in mind than UAS operators. UAS interfaces tend to support tele-operation with a joystick or complex, menu-driven interaction with a steep learning curve. These approaches to control demand constant attention to manage even a single UAS, and they increase heads-down time as the operator searches for and clicks on the right menus to invoke commands. The time and attention these interfaces require make it difficult to expand a single operator's span of control to multiple UASs or to the control of sensor systems. In this paper, we explore an alternative to standard menu-based control interfaces. Our approach was to first study how operators might want to task a UAS if they were not constrained by a typical menu interface. Based on this study, we developed a prototype multi-modal dialogue interface for more intuitive control of multiple unmanned aircraft and their sensor systems using speech and map-based gesture/sketch. The system is a two-way interface: the user can draw on a map while speaking commands, and the system provides feedback to ensure the user knows what it is doing. When the system does not understand the user, for example because speech recognition failed or because the user did not provide enough information, it engages the user in a dialogue to gather the information needed to perform the command. With the help of UAS operators, we conducted a user study comparing our prototype against a representative menu-based control interface in terms of usability, time on task, and mission effectiveness. This paper describes the study we conducted to gather data about how people might use a natural interface, the prototype system itself, and the results of the user study.

Keywords: UAS control, natural interfaces, multi-modal interaction
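To make the interaction pattern in the abstract concrete, the following is a minimal sketch, not the authors' implementation, of fusing a spoken command with a map sketch and falling back to a clarification question when required information is missing. All names (Command, fuse_inputs, next_system_turn, the slot names) are hypothetical illustrations.

# Hypothetical sketch of the two-way dialogue behavior described in the abstract:
# fuse speech-derived slots with map-drawn geometry; if a required slot is
# missing (e.g., speech recognition failed), ask the user a clarification question.

from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Command:
    verb: Optional[str]                            # e.g. "survey" (from speech)
    target_uas: Optional[str]                      # which aircraft is tasked
    region: Optional[List[Tuple[float, float]]]    # polygon sketched on the map

REQUIRED_SLOTS = ("verb", "target_uas", "region")

def fuse_inputs(speech_slots: dict, sketch_region) -> Command:
    """Combine slots extracted from speech with geometry drawn on the map."""
    return Command(
        verb=speech_slots.get("verb"),
        target_uas=speech_slots.get("target_uas"),
        region=sketch_region,
    )

def next_system_turn(cmd: Command) -> str:
    """Confirm a complete command, or ask for whatever is still missing."""
    missing = [s for s in REQUIRED_SLOTS if getattr(cmd, s) is None]
    if not missing:
        return f"Tasking {cmd.target_uas} to {cmd.verb} the sketched region."
    return f"I didn't catch the {missing[0]}. Can you say or sketch it again?"

# Example: the user says "survey this area" while drawing a polygon, but the
# recognizer misses which aircraft was named; the system asks a follow-up.
cmd = fuse_inputs({"verb": "survey"},
                  [(34.00, -117.00), (34.10, -117.00), (34.10, -117.10)])
print(next_system_turn(cmd))   # asks a clarification question about target_uas

The point of the sketch is the dialogue fallback: rather than silently rejecting an incomplete command, the system requests only the missing piece, which is the behavior the abstract credits with reducing heads-down time.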

Paper Details

Date Published: 22 May 2015
PDF: 8 pages
Proc. SPIE 9468, Unmanned Systems Technology XVII, 946802 (22 May 2015); doi: 10.1117/12.2180020
Author Affiliations
Glenn Taylor, Soar Technology, Inc. (United States)
Ben Purman, Soar Technology, Inc. (United States)
Paul Schermerhorn, Soar Technology, Inc. (United States)
Guillermo Garcia-Sampedro, Soar Technology, Inc. (United States)
Robert Hubal, Soar Technology, Inc. (United States)
Kathleen Crabtree, Booz Allen Hamilton Inc. (United States)
Allen Rowe, Air Force Research Lab. (United States)
Sarah Spriggs, Air Force Research Lab. (United States)


Published in SPIE Proceedings Vol. 9468:
Unmanned Systems Technology XVII
Robert E. Karlsen; Douglas W. Gage; Charles M. Shoemaker; Grant R. Gerhart, Editor(s)

© SPIE.