Proceedings Paper
Sensor fusion in a dynamic environment
The recent trend towards dynamic vision has created a need for real-time performance in many vision and control algorithms. Some of the burden placed on algorithms using purely visual input can be lessened by using multiple disparate sensors. Research into integrating information from disparate sensors while moving through an environment has for the most part concentrated on static environments; moving obstacles complicate tasks such as avoidance and path planning. In this paper we present a system that integrates range and visual sensory inputs for the dynamic analysis of motion within the field of view of an autonomous platform. Our approach combines recently developed neural network motion analysis algorithms with an epipolar plane image technique. We report the results of experiments on a synthesized visible/range sequence.