
Proceedings Paper

Multimodal real-world mapping and navigation system for autonomous mobile robots based on neural maps
Author(s): Jose L. Contreras-Vidal; J. Mario Aguilar; Juan Lopez-Coronado; Eduardo Zalama

Paper Abstract

This work describes a neural network-based approach to multimodal real-world mapping and navigation for autonomous mobile robots in unknown environments. The system is built on top of a vector associative map that combines range data from stereo vision and ultrasonic rangefinders. Visual output from a boundary contour system is used to extract range data from a pair of 2-D images, while range data from the ultrasonic rangefinders is used to reduce the uncertainty, noise, and intrinsic errors introduced by the measurements. A recurrent competitive field, used to model multimodal working memory, excites a trajectory formation network that transforms desired temporal patterns (i.e., a trajectory formation pattern) into spatial patterns. The output of this network is processed by direction-sensitive cells, which in turn activate the motor system that guides the mobile robot in unstructured environments. The model is capable of unsupervised, real-time, fast error-based learning of an unstructured environment.
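The recurrent competitive field mentioned in the abstract is a shunting on-center/off-surround network of the Grossberg type. The sketch below is not the paper's implementation; it is a minimal illustration of the standard RCF equation dx_i/dt = -A·x_i + (B - x_i)(I_i + f(x_i)) - x_i·Σ_{k≠i} f(x_k), with an assumed faster-than-linear feedback signal f(x) = x², which makes the field contrast-enhance its input and store the winning activity as a working memory after the input is removed. All parameter values (A, B, dt) are illustrative choices, not taken from the paper.

```python
# Illustrative sketch of a recurrent competitive field (RCF) acting as a
# working memory.  Parameters A, B, dt and the input vector are assumed
# for demonstration; they are not from the paper.
import numpy as np

def recurrent_competitive_field(inputs, steps=2000, dt=0.005, A=0.1, B=1.0):
    """Euler-integrate the shunting RCF dynamics:
       dx_i/dt = -A*x_i + (B - x_i)*(I_i + f(x_i)) - x_i * sum_{k!=i} f(x_k)
       with faster-than-linear feedback f(x) = x**2.
       The input is presented for the first half of the run, then removed,
       so the final activities show what the field has stored."""
    x = np.zeros_like(inputs, dtype=float)
    f = lambda v: v ** 2                      # faster-than-linear feedback
    for t in range(steps):
        I = inputs if t < steps // 2 else np.zeros_like(x)  # input removed halfway
        fx = f(x)
        dx = -A * x + (B - x) * (I + fx) - x * (fx.sum() - fx)
        x = np.clip(x + dt * dx, 0.0, B)      # activities stay in [0, B]
    return x

# Three competing channels; the strongest input wins and is retained
# after the input disappears, while the losers are quenched.
acts = recurrent_competitive_field(np.array([0.2, 0.5, 0.9]))
```

With faster-than-linear feedback the field behaves as a winner-take-all memory: only the channel that received the largest input keeps a sustained activity once the stimulus is gone, which is the property that lets an RCF hold a multimodal pattern in working memory for the downstream trajectory formation network.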

Paper Details

Date Published: 16 September 1992
PDF: 10 pages
Proc. SPIE 1709, Applications of Artificial Neural Networks III, (16 September 1992); doi: 10.1117/12.140003
Author Affiliations:
Jose L. Contreras-Vidal, Boston Univ. (United States)
J. Mario Aguilar, Boston Univ. (United States)
Juan Lopez-Coronado, Univ. of Valladolid (Spain)
Eduardo Zalama, Univ. of Valladolid (Spain)

Published in SPIE Proceedings Vol. 1709:
Applications of Artificial Neural Networks III
Steven K. Rogers, Editor(s)

© SPIE.