
Proceedings Paper

Neural sensor fusion for spatial visualization on a mobile robot
Author(s): Siegfried Martens; Gail A. Carpenter; Paolo Gaudiano

Paper Abstract

An ARTMAP neural network is used to fuse visual and ultrasonic sensory information on a B14 mobile robot. Training samples for the network are acquired without human intervention: as the robot travels in a straight line, sensory snapshots are retrospectively associated with the distance to the wall, as reported by on-board odometry. The goal is to produce a more accurate distance estimate than either raw sensor provides. The neural network effectively combines sensory sources both within and between modalities. The improved distance percept is used to produce occupancy grid visualizations of the robot's environment. The resulting maps point to specific problems in raw sensory processing and demonstrate the benefits of a neural network system for sensor fusion.
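The paper's ARTMAP network learns the sensor-to-distance mapping from data; as a minimal stand-in, the sketch below fuses a (noisier) sonar reading with a visual range reading by inverse-variance weighting and writes the fused estimate into an occupancy grid. The function names, variance values, and grid layout are illustrative assumptions, not the paper's method.

```python
import numpy as np

def fuse_ranges(sonar_d, visual_d, sonar_var=0.25, visual_var=0.04):
    """Inverse-variance weighted fusion of two range readings (metres).
    A hand-coded stand-in for the learned cross-modal mapping."""
    w_s, w_v = 1.0 / sonar_var, 1.0 / visual_var
    return (w_s * sonar_d + w_v * visual_d) / (w_s + w_v)

def update_grid(grid, robot_xy, fused_d, cell_size=0.1):
    """Assume the robot faces +x: mark cells between the robot and the
    fused wall distance as free (0.0) and the wall cell as occupied (1.0).
    Untouched cells keep their prior value (e.g. 0.5 = unknown)."""
    x0, y = robot_xy
    row = int(y / cell_size)
    start, wall = int(x0 / cell_size), int((x0 + fused_d) / cell_size)
    grid[row, start:wall] = 0.0            # traversed free space
    if wall < grid.shape[1]:
        grid[row, wall] = 1.0              # estimated wall position
    return grid

# Example: sonar says 2.30 m, vision says 2.05 m; the fused estimate
# leans toward the lower-variance visual reading.
grid = np.full((10, 50), 0.5)              # 10 x 50 grid, all unknown
d = fuse_ranges(2.30, 2.05)
update_grid(grid, (0.0, 0.5), d)
```

Because the visual reading is assigned a much smaller variance here, the fused distance lands near 2.08 m; the same pipeline structure (fuse per snapshot, then rasterize into the grid) is what the abstract's ARTMAP system replaces with a learned mapping.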

Paper Details

Date Published: 9 October 1998
PDF: 12 pages
Proc. SPIE 3523, Sensor Fusion and Decentralized Control in Robotic Systems, (9 October 1998); doi: 10.1117/12.326991
Author Affiliations:
Siegfried Martens, Boston Univ. (United States)
Gail A. Carpenter, Boston Univ. (United States)
Paolo Gaudiano, Boston Univ. (United States)

Published in SPIE Proceedings Vol. 3523:
Sensor Fusion and Decentralized Control in Robotic Systems
Paul S. Schenker; Gerard T. McKee, Editors

© SPIE.