
Proceedings Paper

UAV vision-based localization techniques using high-altitude images and barometric altimeter
Author(s): K. Yawata; T. Yamamoto; J. Watanabe; Y. Nishikawa

Paper Abstract

Position information of unmanned aerial vehicles (UAVs) and objects is important for inspections conducted with UAVs. The accuracy with which changes in an object under inspection are detected depends on the accuracy of the past object data being compared; therefore, accurate position recording is important. The Global Positioning System (GPS) is commonly used to estimate position, but its accuracy is sometimes insufficient. Therefore, other methods have been proposed, such as visual simultaneous localization and mapping (visual SLAM), which uses monocular camera data to reconstruct a 3D model of a scene while simultaneously estimating the trajectory of the camera, using only photos or videos.

In visual SLAM, the UAV position is estimated on the basis of stereo vision (localization), and 3D points are mapped on the basis of the estimated UAV position (mapping). Localization and mapping are performed alternately in sequence. Finally, all the UAV positions are estimated and an integrated 3D map is created. Each iteration of this sequential processing introduces some estimation error, but the next iteration uses the previously estimated position as its base position regardless of this error. As a result, error accumulates until the UAV returns to a location it has passed before. Our research aims to mitigate this problem. We propose two new methods.
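The drift mechanism described above can be illustrated with a minimal numerical sketch (not the paper's code, and the step size and noise level are purely hypothetical): each new pose is chained onto the previous estimated pose, so per-step estimation errors are carried forward and accumulate.

```python
import numpy as np

# Illustrative sketch of error accumulation in sequential pose estimation:
# each frame's pose is computed relative to the previous *estimated* pose,
# so small per-step errors compound into drift over the trajectory.
rng = np.random.default_rng(0)

true_step = np.array([1.0, 0.0])   # UAV moves 1 m east per frame (hypothetical)
n_frames = 100
noise_sigma = 0.05                 # hypothetical per-step estimation error (m)

est_pose = np.zeros(2)
true_pose = np.zeros(2)
for _ in range(n_frames):
    true_pose = true_pose + true_step
    # the error of every earlier step is carried into this estimate
    est_pose = est_pose + true_step + rng.normal(0.0, noise_sigma, size=2)

drift = np.linalg.norm(est_pose - true_pose)
print(f"accumulated drift after {n_frames} frames: {drift:.2f} m")
```

Because the per-step errors behave like a random walk, the expected drift grows with the square root of the number of frames rather than staying bounded, which is why a correction against an absolute reference (such as a high-altitude image) is needed.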

(1) Accumulated error caused by local matching with sequential low-altitude images (i.e., close-up photos) is corrected with global matching between low- and high-altitude images. To make this global matching robust against error, we implemented a method wherein the expected matching areas are narrowed down on the basis of the estimated UAV position and barometric-altimeter measurements.
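One way such narrowing could work is sketched below (an assumed formulation for illustration; the function names, parameters, and the simple nadir-camera model are not taken from the paper): the estimated UAV ground position plus a search radius is converted into a pixel window in the high-altitude image using the ground sampling distance (GSD) implied by the barometric altitude.

```python
# Hypothetical sketch: restrict feature matching in the high-altitude image
# to a window around the UAV's estimated ground position. Assumes a
# nadir-pointing camera and a flat scene; all names are illustrative.

def ground_sampling_distance(altitude_m, focal_px):
    """Metres on the ground per pixel for a nadir camera at a given altitude."""
    return altitude_m / focal_px

def search_window(est_xy_m, radius_m, origin_xy_m, altitude_m, focal_px):
    """Pixel bounding box (x0, y0, x1, y1) in the high-altitude image."""
    gsd = ground_sampling_distance(altitude_m, focal_px)
    cx = (est_xy_m[0] - origin_xy_m[0]) / gsd
    cy = (est_xy_m[1] - origin_xy_m[1]) / gsd
    r = radius_m / gsd
    return (cx - r, cy - r, cx + r, cy + r)

# e.g. a 20 m search radius in an image taken at 1000 m altitude
# with a 1000 px focal length (GSD = 1 m/px)
box = search_window((50.0, 30.0), 20.0, (0.0, 0.0), 1000.0, 1000.0)
print(box)
```

Restricting matching to such a window both reduces computation and rejects spurious matches from visually similar but distant areas, which is the robustness benefit the abstract alludes to.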

(2) Under the assumption that the absolute coordinates include axis-rotation error, we propose an error-reduction method that minimizes the difference between the UAV's altitude from visual SLAM and from the sensors (barometer and thermometer).
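A minimal sketch of this idea, under simplifying assumptions that are ours rather than the paper's (a single tilt angle about one horizontal axis, fitted by grid search), is to find the rotation that minimizes the summed squared difference between the SLAM altitudes and the barometric altitudes:

```python
import numpy as np

# Illustrative sketch: estimate a tilt angle about the y-axis that best
# aligns visual-SLAM altitudes with barometric-altimeter altitudes by
# minimizing the summed squared altitude difference. The one-angle model
# and grid-search solver are assumptions for illustration.

def corrected_altitude(x_slam, z_slam, theta):
    # z-component after rotating the SLAM frame by theta about the y-axis
    return -x_slam * np.sin(theta) + z_slam * np.cos(theta)

def fit_tilt(x_slam, z_slam, z_baro, n_grid=10001):
    thetas = np.linspace(-np.pi / 8, np.pi / 8, n_grid)
    costs = [np.sum((corrected_altitude(x_slam, z_slam, t) - z_baro) ** 2)
             for t in thetas]
    return thetas[int(np.argmin(costs))]

# Synthetic check: level flight at 30 m, SLAM frame tilted by 0.05 rad
true_theta = 0.05
x_true = np.linspace(0.0, 100.0, 50)
z_true = np.full_like(x_true, 30.0)
x_slam = x_true * np.cos(true_theta) - z_true * np.sin(true_theta)
z_slam = x_true * np.sin(true_theta) + z_true * np.cos(true_theta)

theta_hat = fit_tilt(x_slam, z_slam, z_true)
print(f"recovered tilt: {theta_hat:.3f} rad")
```

The point of the synthetic check is that a rotation error invisible to SLAM alone becomes observable once an independent altitude sensor supplies the true vertical profile.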

The proposed methods reduced accumulated error by using high-altitude images and sensors. Our methods improve the accuracy of UAV- and object-position estimation.

Paper Details

Date Published: 3 May 2018
PDF: 10 pages
Proc. SPIE 10640, Unmanned Systems Technology XX, 106400K (3 May 2018); doi: 10.1117/12.2302401
Author Affiliations:
K. Yawata, Hitachi, Ltd. (Japan)
T. Yamamoto, Hitachi, Ltd. (Japan)
J. Watanabe, Hitachi, Ltd. (Japan)
Y. Nishikawa, Hitachi, Ltd. (Japan)

Published in SPIE Proceedings Vol. 10640:
Unmanned Systems Technology XX
Robert E. Karlsen; Douglas W. Gage; Charles M. Shoemaker; Hoa G. Nguyen, Editor(s)

© SPIE.