
Proceedings Paper

Deep convolutional neural network target classification for underwater synthetic aperture sonar imagery
Author(s): A. Galusha; J. Dale; J. M. Keller; A. Zare

Paper Abstract

In underwater synthetic aperture sonar (SAS) imagery, there is a need for accurate target recognition algorithms. Automated detection of underwater objects has many applications, not least of which is the safe removal of dangerous explosives. In this paper, we discuss experiments on a deep learning approach to binary classification of target and non-target SAS image tiles. Using a fused anomaly detector, the pixels in each SAS image were narrowed down to regions of interest (ROIs), from which small target-sized tiles were extracted; this tile data set was created prior to the work presented in this paper. Our objective is to carry out extensive tests of the classification accuracy of deep convolutional neural networks (CNNs) using location-based cross-validation. We discuss the results of varying network architectures, hyperparameters, and loss and activation functions, in conjunction with an analysis of training and testing set configurations. We also analyze these network setups in depth, rather than merely comparing classification accuracy. The approach is tested on a collection of SAS imagery.
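The abstract's location-based cross-validation means that folds are split by survey location rather than by individual tile, so tiles collected at the same site never appear in both the training and testing splits. The sketch below illustrates that idea; the function name, fold count, and tile/site labels are illustrative assumptions, not taken from the paper.

```python
from collections import defaultdict

def location_based_folds(tile_ids, locations, n_folds=5):
    """Assign whole survey locations to folds so no location
    contributes tiles to both a training and a testing split.
    (Illustrative sketch; names and fold count are assumptions.)"""
    by_loc = defaultdict(list)
    for tid, loc in zip(tile_ids, locations):
        by_loc[loc].append(tid)
    folds = [[] for _ in range(n_folds)]
    # Round-robin locations into folds, largest location first,
    # for a rough balance of fold sizes.
    ordered = sorted(by_loc, key=lambda loc: -len(by_loc[loc]))
    for i, loc in enumerate(ordered):
        folds[i % n_folds].extend(by_loc[loc])
    return folds

# Example: six tiles from three hypothetical survey sites, two folds.
tiles = ["t1", "t2", "t3", "t4", "t5", "t6"]
sites = ["A", "A", "B", "B", "C", "C"]
folds = location_based_folds(tiles, sites, n_folds=2)
```

Each fold then serves in turn as the held-out test set, giving an accuracy estimate that reflects generalization to unseen sites rather than memorization of site-specific background texture.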

Paper Details

Date Published: 10 May 2019
PDF: 11 pages
Proc. SPIE 11012, Detection and Sensing of Mines, Explosive Objects, and Obscured Targets XXIV, 1101205 (10 May 2019); doi: 10.1117/12.2519521
Author Affiliations:
A. Galusha, Univ. of Missouri (United States)
J. Dale, Univ. of Missouri (United States)
J. M. Keller, Univ. of Missouri (United States)
A. Zare, Univ. of Florida (United States)


Published in SPIE Proceedings Vol. 11012:
Detection and Sensing of Mines, Explosive Objects, and Obscured Targets XXIV
Steven S. Bishop; Jason C. Isaacs, Editor(s)

© SPIE