
Proceedings Paper

Dynamic composition of tracking primitives for interactive vision-guided navigation
Author(s): Darius Burschka; Gregory D. Hager

Paper Abstract

We present a system architecture for robust target following with a mobile robot. The system is based on tracking multiple cues in binocular stereo images using the XVision toolkit. Fusion of complementary information in the images, including texture, color, and depth, combined with fast, optimized processing, reduces the possibility of losing the tracked object in a dynamic scene with several moving targets on intersecting paths. The presented system is capable of detecting objects obstructing its path as well as gaps. This supports operation in more cluttered terrain, where a wheeled mobile robot cannot take the same path as a walking person. We describe the basic principles of the fast feature extraction and tracking in the luminance, chrominance, and disparity domains. The optimized tracking algorithms compensate for illumination variations and perspective distortions, as already presented in our previous publications on the XVision system.
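The cue-fusion idea in the abstract can be illustrated with a minimal sketch in C++ (not the authors' XVision code): each cue tracker is assumed to return a normalized confidence score, and the candidate best supported jointly by texture, color, and depth is kept as the tracked target. All names, weights, and values below are hypothetical.

```cpp
// Minimal sketch of multi-cue fusion for target tracking.
// Assumes each cue tracker already returns a confidence in [0, 1];
// the weights and candidate values are purely illustrative.
#include <cstddef>
#include <iostream>
#include <vector>

struct Candidate {
    double x, y;                      // image position of the candidate target
    double texture, color, disparity; // per-cue confidences, assumed in [0, 1]
};

// Weighted fusion of complementary cues; the weights are assumptions, not
// values from the paper.
double fuse(const Candidate& c,
            double wTexture = 0.4, double wColor = 0.3, double wDisparity = 0.3) {
    return wTexture * c.texture + wColor * c.color + wDisparity * c.disparity;
}

int main() {
    // Two moving targets on intersecting paths; the fused score keeps the
    // tracker locked on the candidate supported by all three cues.
    std::vector<Candidate> candidates = {
        {120.0, 80.0, 0.9, 0.8, 0.85}, // consistent in texture, color and depth
        {125.0, 82.0, 0.3, 0.9, 0.2},  // similar color only (e.g. a passer-by)
    };

    std::size_t best = 0;
    for (std::size_t i = 1; i < candidates.size(); ++i)
        if (fuse(candidates[i]) > fuse(candidates[best])) best = i;

    std::cout << "Tracked target at (" << candidates[best].x << ", "
              << candidates[best].y << ")\n";
    return 0;
}
```

A single distracting cue (here, a similarly colored second target) does not pull the tracker away, because the fused score rewards agreement across the complementary luminance, chrominance, and disparity channels.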

Paper Details

Date Published: 18 February 2002
PDF: 12 pages
Proc. SPIE 4573, Mobile Robots XVI, (18 February 2002); doi: 10.1117/12.457436
Author Affiliations:
Darius Burschka, Johns Hopkins Univ. (United States)
Gregory D. Hager, Johns Hopkins Univ. (United States)


Published in SPIE Proceedings Vol. 4573:
Mobile Robots XVI
Douglas W. Gage; Howie M. Choset, Editor(s)

© SPIE.