
Proceedings Paper

Experimental application of simulation tools for evaluating UAV video change detection
Author(s): Günter Saur; Jan Bartelsen

Paper Abstract

Change detection is one of the most important tasks when unmanned aerial vehicles (UAV) are used for video reconnaissance and surveillance. In this paper, we address changes on a short time scale, i.e., observations taken within a time distance of a few hours. Each observation is a short video sequence corresponding to a near-nadir overflight of the UAV above the area of interest, and the relevant changes are, e.g., recently added or removed objects. The change detection algorithm has to distinguish between relevant and non-relevant changes. Examples of non-relevant changes are changeable objects such as trees, and compression or transmission artifacts. To enable the use of automatic change detection within the interactive workflow of a UAV video exploitation system, an evaluation and assessment procedure has to be performed. A comprehensive evaluation requires large video data sets that contain many relevant objects under varying scene backgrounds and varying influence parameters (e.g., image quality, sensor and flight parameters), together with image metadata and ground truth data. Since the acquisition of real video data is limited by cost and time constraints, the generation of synthetic data with simulation tools has, from our point of view, to be considered. In this paper, the processing chain of Saur et al. (2014) [1] and the interactive workflow for video change detection are described. We have selected the commercial simulation environment Virtual Battlespace 3 (VBS3) to generate synthetic data. For an experimental setup, an example scenario "road monitoring" has been defined, and several video clips have been produced with varying flight and sensor parameters and varying objects in the scene. Image registration and change mask extraction, both components of the processing chain, are applied to corresponding frames of different video clips. For the selected examples, the images could be registered, the modelled changes could be extracted, and the rendering artifacts treated as noise (slight differences in heading angle, variability of vegetation, 3D parallax) could be suppressed. We conclude that these image data can be considered realistic enough to serve as evaluation data for the selected processing components. Future work will extend the evaluation to other influence parameters and may include the human operator for mission planning and sensor control.
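To make the two processing steps named in the abstract concrete, the following is a minimal illustrative sketch, not the authors' implementation: frame-to-frame image registration followed by change mask extraction on corresponding frames of two overflights. The use of OpenCV, ORB features, a RANSAC homography, absolute differencing, and the file names and thresholds are all assumptions for illustration; the paper itself follows the processing chain of Saur et al. (2014) [1].

```python
# Hypothetical sketch of registration + change mask extraction (OpenCV).
# Parameters and filenames are illustrative, not from the paper.
import cv2
import numpy as np

def register_frames(ref, mov):
    """Warp `mov` onto `ref` via ORB feature matching and a homography."""
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(ref, None)
    k2, d2 = orb.detectAndCompute(mov, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d2, d1), key=lambda m: m.distance)[:200]
    src = np.float32([k2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return cv2.warpPerspective(mov, H, (ref.shape[1], ref.shape[0]))

def change_mask(ref, registered, thresh=40, min_area=50):
    """Threshold the absolute difference and suppress small noise blobs."""
    diff = cv2.absdiff(ref, registered)
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    # Drop small connected components (e.g., swaying vegetation, parallax).
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    for i in range(1, n):
        if stats[i, cv2.CC_STAT_AREA] < min_area:
            mask[labels == i] = 0
    return mask

if __name__ == "__main__":
    # Corresponding frames from two overflights of the same area.
    f1 = cv2.imread("overflight1_frame.png", cv2.IMREAD_GRAYSCALE)
    f2 = cv2.imread("overflight2_frame.png", cv2.IMREAD_GRAYSCALE)
    warped = register_frames(f1, f2)
    cv2.imwrite("change_mask.png", change_mask(f1, warped))
```

The morphological opening and the minimum-area filter stand in for the noise suppression the abstract mentions (vegetation variability, 3D parallax, slight heading differences); distinguishing relevant from non-relevant changes in general requires more than simple differencing.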

Paper Details

Date Published: 21 October 2015
PDF: 7 pages
Proc. SPIE 9653, Target and Background Signatures, 96530P (21 October 2015); doi: 10.1117/12.2197348
Author Affiliations:
Günter Saur, Fraunhofer-Institut für Optronik, Systemtechnik und Bildauswertung (Germany)
Jan Bartelsen, Fraunhofer-Institut für Optronik, Systemtechnik und Bildauswertung (Germany)


Published in SPIE Proceedings Vol. 9653:
Target and Background Signatures
Karin U. Stein; Ric H. M. A. Schleijpen, Editor(s)

© SPIE.