
Proceedings Paper

Reconstructing and visualizing dense global visual maps for extended passive navigation
Author(s): Yuntao Cui; John J. Weng

Paper Abstract

We consider the task of passive navigation, in which a stereo visual sensor system moves through an unknown scene. To guide autonomous navigation, it is important to build a visual map that records the locations and shapes of the objects in the scene in world coordinates. The extended global visual map is an integration of local maps. The approach described in this paper integrates motion estimation, stereo matching, temporal tracking, and Delaunay triangulation interpolation. Through stereo matching, each frame (a stereo image pair) produces a set of 3D points of the current scene. The global structure of these 3D points is obtained using the results of the motion estimation. Delaunay tetrahedralization interpolates the three-dimensional data points with a simplicial polyhedral surface. The experiment uses 151 frames of stereo images acquired from a moving mobile robot.
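The final interpolation step in the abstract can be illustrated with a minimal sketch: given a cloud of 3D points (here random, standing in for points recovered by stereo matching), a Delaunay tetrahedralization partitions their convex hull into tetrahedra. This uses `scipy.spatial.Delaunay` as an assumed stand-in; the paper's own implementation is not described here.

```python
import numpy as np
from scipy.spatial import Delaunay

# Hypothetical 3D point cloud, standing in for points
# reconstructed from a stereo image pair.
rng = np.random.default_rng(0)
points = rng.random((30, 3))

# Delaunay tetrahedralization of the point set: each simplex
# is a tetrahedron, i.e. a row of 4 vertex indices into `points`.
tet = Delaunay(points)
print(tet.simplices.shape)
```

In practice, per-frame tetrahedralizations like this would be registered into a common world frame using the estimated camera motion before being merged into the global map.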

Paper Details

Date Published: 7 April 1995
PDF: 8 pages
Proc. SPIE 2410, Visual Data Exploration and Analysis II, (7 April 1995); doi: 10.1117/12.205951
Author Affiliations:
Yuntao Cui, Michigan State Univ. (United States)
John J. Weng, Michigan State Univ. (United States)

Published in SPIE Proceedings Vol. 2410:
Visual Data Exploration and Analysis II
Richard N. Ellson; Georges G. Grinstein; Robert F. Erbacher, Editor(s)

© SPIE.