
Proceedings Paper

A visual odometry method based on the SwissRanger SR4000
Author(s): Cang Ye; Michael Bruch

Paper Abstract

This paper presents a pose estimation method based on a 3D camera, the SwissRanger SR4000. The proposed method estimates the camera's ego-motion by using the intensity and range data produced by the camera. It detects SIFT (Scale-Invariant Feature Transform) features in one intensity image and matches them to those in the next intensity image. The resulting pairs of 3D data points are used to compute the least-squares rotation and translation, from which the attitude and position changes between the two image frames are determined. Because the method uses feature descriptors to perform feature matching, it works well with large image motion between frames without the need for a spatial correlation search. Due to the SR4000's consistent accuracy in depth measurement, the proposed method may achieve better pose estimation accuracy than a stereovision-based approach. Another advantage of the proposed method is that the SR4000's range data are complete and can therefore also be used for obstacle avoidance/negotiation. This makes it possible to navigate a mobile robot using a single perception sensor. In this paper, we validate the idea behind the pose estimation method and characterize its pose estimation performance.
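The least-squares rotation/translation step described in the abstract can be sketched as follows. This is a minimal sketch using the standard SVD-based absolute-orientation method (Arun et al.) for aligning matched 3D point pairs; the function and variable names are hypothetical illustrations, not taken from the paper itself.

```python
import numpy as np

def estimate_rigid_transform(p, q):
    """Least-squares rigid transform (R, t) such that q ≈ R @ p + t.

    p, q: (N, 3) arrays of matched 3D points, e.g. SR4000 range data
    sampled at SIFT feature locations in two consecutive frames.
    """
    # Center both point sets on their centroids.
    p_c = p.mean(axis=0)
    q_c = q.mean(axis=0)

    # 3x3 cross-covariance of the centered point sets.
    H = (p - p_c).T @ (q - q_c)

    # SVD gives the optimal rotation; the diagonal correction
    # guards against a reflection (det(R) = -1) solution.
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

    # Translation follows from the centroids.
    t = q_c - R @ p_c
    return R, t
```

The attitude change between the two frames can then be read off from `R` (e.g. as Euler angles) and the position change from `t`, accumulated frame-to-frame to obtain the camera trajectory.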

Paper Details

Date Published: 7 May 2010
PDF: 9 pages
Proc. SPIE 7692, Unmanned Systems Technology XII, 76921I (7 May 2010); doi: 10.1117/12.850349
Author Affiliations:
Cang Ye, Univ. of Arkansas at Little Rock (United States)
Michael Bruch, Space and Naval Warfare Systems Ctr. Pacific (United States)


Published in SPIE Proceedings Vol. 7692:
Unmanned Systems Technology XII
Grant R. Gerhart; Douglas W. Gage; Charles M. Shoemaker, Editor(s)

© SPIE.