
Proceedings Paper

Conditional generative adversarial networks for data augmentation and adaptation in remotely sensed imagery
Author(s): Jonathan Howe; Kyle Pula; Aaron A. Reite

Paper Abstract

The difficulty in obtaining labeled data relevant to a given task is among the most common and well-known practical obstacles to applying deep learning techniques to new or even slightly modified domains. The data volumes required by the current generation of supervised learning algorithms typically far exceed what a human needs to learn and complete a given task. We investigate ways to expand a given labeled corpus of remotely sensed imagery into a larger corpus using Generative Adversarial Networks (GANs). We then measure how these additional synthetic data affect supervised machine learning performance on an object detection task.

Our data-driven strategy is to train GANs to (1) generate synthetic segmentation masks and (2) generate plausible synthetic remote sensing imagery corresponding to those segmentation masks. Run sequentially, these GANs allow the generation of synthetic remote sensing imagery complete with segmentation labels. We apply this strategy to the data set from the ISPRS 2D Semantic Labeling Contest - Potsdam, with a follow-on vehicle detection task. We find that in scenarios with limited training data, augmenting the available data with such synthetically generated data can improve detector performance.
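The abstract describes this two-stage pipeline only at a high level. The sketch below illustrates the sequential inference step under stated assumptions: MaskGenerator and ImageGenerator are hypothetical stand-ins for the paper's trained GAN generators, the six-class label space mirrors the Potsdam dataset's semantic categories, and all architectures and tensor shapes are illustrative rather than taken from the paper.

```python
import torch
import torch.nn as nn

# Stage 1 (hypothetical): latent noise vector -> synthetic segmentation mask.
class MaskGenerator(nn.Module):
    def __init__(self, z_dim=128, n_classes=6, size=64):
        super().__init__()
        self.n_classes, self.size = n_classes, size
        self.net = nn.Sequential(
            nn.Linear(z_dim, 256),
            nn.ReLU(),
            nn.Linear(256, n_classes * size * size),
        )

    def forward(self, z):
        logits = self.net(z).view(-1, self.n_classes, self.size, self.size)
        # Hard per-pixel class labels form the synthetic segmentation mask.
        return logits.argmax(dim=1)

# Stage 2 (hypothetical): mask-conditioned image synthesis, in the spirit of
# pix2pix-style translation from a one-hot label map to RGB imagery.
class ImageGenerator(nn.Module):
    def __init__(self, n_classes=6):
        super().__init__()
        self.n_classes = n_classes
        self.net = nn.Sequential(
            nn.Conv2d(n_classes, 32, 3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
            nn.Tanh(),  # synthetic imagery scaled to [-1, 1]
        )

    def forward(self, mask):
        onehot = nn.functional.one_hot(mask, self.n_classes)
        onehot = onehot.permute(0, 3, 1, 2).float()
        return self.net(onehot)

# Run sequentially: generate masks first, then imagery conditioned on them,
# yielding synthetic image/label pairs for augmenting a detection corpus.
mask_gen, image_gen = MaskGenerator(), ImageGenerator()
z = torch.randn(4, 128)    # batch of 4 latent codes
masks = mask_gen(z)        # (4, 64, 64) integer class maps
images = image_gen(masks)  # (4, 3, 64, 64) synthetic imagery
print(masks.shape, images.shape)
```

Because the masks are generated before the imagery, every synthetic image arrives with pixel-level labels for free, which is what makes the augmented corpus usable for the downstream vehicle detection task.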

Paper Details

Date Published: 6 September 2019
PDF: 13 pages
Proc. SPIE 11139, Applications of Machine Learning, 111390G (6 September 2019); doi: 10.1117/12.2529586
Author Affiliations:
Jonathan Howe, NVIDIA Corp. (United States)
Kyle Pula, CACI International Inc. (United States)
Aaron A. Reite, NGA Research (United States)


Published in SPIE Proceedings Vol. 11139:
Applications of Machine Learning
Michael E. Zelinski; Tarek M. Taha; Jonathan Howe; Abdul A. S. Awwal; Khan M. Iftekharuddin, Editor(s)
