
Proceedings Paper

Dynamic Bayes net approach to multimodal sensor fusion

Paper Abstract

Autonomous mobile robots rely on multiple sensors to perform a variety of tasks in a given environment. Different tasks may require different sensors to estimate different subsets of the world state, and different sensors can cooperate in discovering common subsets of the world state. This paper presents a new approach to multimodal sensor fusion using dynamic Bayesian networks and an occupancy grid. The environment in which the robot operates is represented as an occupancy grid, which is asynchronously updated with probabilistic data obtained from multiple sensors and combined using Bayesian networks. Each cell in the occupancy grid stores multiple probability density functions representing the combined evidence for the identity, location, and properties of objects in the world. The occupancy grid also contains probabilistic representations of moving objects. Bayes nets allow information from one modality to provide cues for interpreting the output of sensors in other modalities. Establishing correlations or associations between sensor readings or interpretations leads to learning the conditional relationships between them. Thus bottom-up, reflexive, or even accidentally obtained information can provide top-down cues for other sensing strategies. We present early results obtained for a mobile robot navigation task.

Paper Details

Date Published: 22 September 1997
PDF: 9 pages
Proc. SPIE 3209, Sensor Fusion and Decentralized Control in Autonomous Robotic Systems, (22 September 1997); doi: 10.1117/12.287628
Author Affiliations:
Amit Singhal, Univ. of Rochester (United States)
Christopher M. Brown, Univ. of Rochester (United States)


Published in SPIE Proceedings Vol. 3209:
Sensor Fusion and Decentralized Control in Autonomous Robotic Systems
Paul S. Schenker; Gerard T. McKee, Editors
