
Proceedings Paper

Video-based realtime IMU-camera calibration for robot navigation
Author(s): Arne Petersen; Reinhard Koch

Paper Abstract

This paper introduces a new method for the fast calibration of inertial measurement units (IMUs) that are rigidly coupled to cameras. That is, the relative rotation and translation between the IMU and the camera are estimated, allowing IMU data to be transferred into the camera's coordinate frame. Moreover, the IMU's nuisance parameters (biases and scales) and the horizontal alignment of the initial camera frame are determined. Since an iterated Kalman filter is used for estimation, information on the estimation's precision is also available. Such calibrations are crucial for IMU-aided visual robot navigation, i.e. SLAM, since incorrect calibrations cause biases and drifts in the estimated position and orientation. As the estimation is performed in real time, the calibration can be done using a freehand movement and the estimated parameters can be validated immediately. This provides the opportunity to optimize the trajectory online, increasing the quality and minimizing the time required for calibration. Except for a marker pattern used for visual tracking, no additional hardware is required. As will be shown, the system is capable of estimating the calibration within a short period of time: depending on the requested precision, trajectories of 30 seconds to a few minutes are sufficient. This allows the system to be calibrated at startup, so that deviations in the calibration due to transport and storage can be compensated. The estimation quality and consistency are evaluated as a function of the traveled trajectories and the amount of IMU-camera displacement and rotational misalignment. It is analyzed how different types of visual markers, i.e. 2- and 3-dimensional patterns, affect the estimation. Moreover, the method is applied to mono and stereo vision systems, providing information on its applicability to robot systems. The algorithm is implemented using a modular software framework, so that it can be adapted to changed conditions easily.
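To illustrate what such a calibration is used for, the sketch below applies estimated calibration parameters to transfer a raw gyroscope sample into the camera's coordinate frame. This is only a minimal illustration of the data-transfer step described in the abstract, not the paper's estimation method; all parameter values (rotation `R_cam_imu`, biases, scales) are hypothetical placeholders standing in for the quantities the iterated Kalman filter would estimate.

```python
import numpy as np

# Hypothetical calibration results (placeholder values, not from the paper):
R_cam_imu = np.eye(3)                        # estimated IMU-to-camera rotation
gyro_bias = np.array([0.01, -0.02, 0.005])   # estimated gyro biases [rad/s]
gyro_scale = np.array([1.01, 0.99, 1.00])    # estimated per-axis scale factors

def gyro_to_camera_frame(omega_raw):
    """Correct a raw gyroscope sample for the estimated bias and scale,
    then express it in the camera coordinate frame via the calibrated
    IMU-to-camera rotation (assuming measured = scale * true + bias)."""
    omega_corrected = (omega_raw - gyro_bias) / gyro_scale
    return R_cam_imu @ omega_corrected

omega_cam = gyro_to_camera_frame(np.array([0.1, 0.0, 0.0]))
```

Without a correct calibration, the bias and misalignment terms in this transfer would propagate directly into the SLAM pose estimate, which is the drift the paper's method is designed to prevent.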

Paper Details

Date Published: 1 May 2012
PDF: 10 pages
Proc. SPIE 8437, Real-Time Image and Video Processing 2012, 843706 (1 May 2012); doi: 10.1117/12.924066
Author Affiliations:
Arne Petersen, Christian-Albrechts-Univ. zu Kiel (Germany)
Reinhard Koch, Christian-Albrechts-Univ. zu Kiel (Germany)

Published in SPIE Proceedings Vol. 8437:
Real-Time Image and Video Processing 2012
Nasser Kehtarnavaz; Matthias F. Carlsohn, Editor(s)

© SPIE. Terms of Use