
Proceedings Paper

Gaze interaction in UAS video exploitation
Author(s): Jutta Hild; Stefan Brüstle; Norbert Heinze; Elisabeth Peinsipp-Byma

Paper Abstract

A frequently occurring interaction task in UAS video exploitation is the marking or selection of objects of interest in the video. Once the image analyst visually detects an object of interest, selecting or marking it is necessary for further exploitation, documentation, and communication with the team. Today, object selection is usually performed with the mouse. Because sensor motion causes all objects in the video to move, object selection can be rather challenging, especially when strong and fast ego-motions are present, e.g., with small airborne sensor platforms. In addition, objects of interest are sometimes visible too briefly to be selected by the analyst using the mouse. To address this issue, we propose an eye tracker as the input device for object selection. Since the eye tracker continuously provides the analyst's gaze position on the monitor, it is intuitive to use the gaze position for pointing at an object; the selection is then actuated by pressing a button. We integrated this gaze-based “gaze + key press” object selection into Fraunhofer IOSB's exploitation station ABUL, using a Tobii X60 eye tracker and a standard keyboard for the button press. Representing the object selections in a spatial relational database, ABUL enables the image analyst to efficiently query the video data in a post-processing step for selected objects of interest with respect to their geographical and other properties. An experimental evaluation comparing gaze-based interaction with mouse interaction for object selection in UAS videos is presented.
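
The abstract describes the “gaze + key press” selection only at a conceptual level. As a rough illustration under assumed interfaces (the object list and gaze sample below are hypothetical placeholders, not the Tobii SDK or ABUL's actual API), a minimal Python sketch of one such selection step might look as follows: on the key press, the object whose centroid lies closest to the current gaze point, within a pixel tolerance, is taken as the selection.

    import math

    def select_object(gaze_xy, objects, max_dist_px=40.0):
        """Return the tracked object whose centroid is closest to the gaze point,
        provided it lies within the pixel tolerance; otherwise return None."""
        best, best_dist = None, max_dist_px
        for obj in objects:
            cx, cy = obj["center"]  # object centroid in monitor coordinates (pixels)
            dist = math.hypot(gaze_xy[0] - cx, gaze_xy[1] - cy)
            if dist <= best_dist:
                best, best_dist = obj, dist
        return best

    # Example: the analyst fixates near the second object and presses the selection key.
    tracked_objects = [
        {"id": 17, "center": (412.0, 300.5)},   # e.g., a moving vehicle
        {"id": 23, "center": (850.3, 512.0)},
    ]
    gaze_position = (845.0, 520.0)              # latest gaze sample from the eye tracker
    print(select_object(gaze_position, tracked_objects))
    # -> {'id': 23, 'center': (850.3, 512.0)}

In the system described by the paper, such a selection would then be stored in the spatial relational database for later querying; the sketch above covers only the pointing-and-confirm step.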

Paper Details

Date Published: 16 May 2013
PDF: 10 pages
Proc. SPIE 8740, Motion Imagery Technologies, Best Practices, and Workflows for Intelligence, Surveillance, and Reconnaissance (ISR), and Situational Awareness, 87400H (16 May 2013); doi: 10.1117/12.2015928
Author Affiliations:
Jutta Hild, Fraunhofer Institute of Optronics, System Technologies and Image Exploitation (Germany)
Stefan Brüstle, Fraunhofer Institute of Optronics, System Technologies and Image Exploitation (Germany)
Norbert Heinze, Fraunhofer Institute of Optronics, System Technologies and Image Exploitation (Germany)
Elisabeth Peinsipp-Byma, Fraunhofer Institute of Optronics, System Technologies and Image Exploitation (Germany)


Published in SPIE Proceedings Vol. 8740:
Motion Imagery Technologies, Best Practices, and Workflows for Intelligence, Surveillance, and Reconnaissance (ISR), and Situational Awareness
Donnie Self, Editor

© SPIE.