Proceedings Paper

Utilization and viability of biologically inspired algorithms in a dynamic multiagent camera surveillance system
Author(s): Terrell N Mundhenk; Nitin Dhavale; Salvador Marmol; Elizabeth Calleja; Vidhya Navalpakkam; Kirstie Bellman; Chris Landauer; Michael A Arbib; Laurent Itti

Paper Abstract

In view of the growing complexity of computational tasks and their design, we propose that certain interactive systems may be better designed by utilizing computational strategies based on the study of the human brain. Compared with current engineering paradigms, brain theory offers the promise of improved self-organization and adaptation to the current environment, freeing the programmer from having to address those issues procedurally when designing and implementing large-scale complex systems. To advance this hypothesis, we discuss a multi-agent surveillance system in which 12 agent CPUs, each with its own camera, compete and cooperate to monitor a large room. To cope with the overload of image data streaming from 12 cameras, we take inspiration from the primate visual system, which allows the animal to perform real-time selection of the few most conspicuous locations in its visual input. This is accomplished by having each camera agent use the bottom-up, saliency-based visual attention algorithm of Itti and Koch (Vision Research 2000;40(10-12):1489-1506) to scan the scene for objects of interest. Real-time operation is achieved with a distributed version of the algorithm that runs on a 16-CPU Beowulf cluster composed of the agent computers. The algorithm guides cameras to track and monitor salient objects based on maps of color, orientation, intensity, and motion. To spread camera viewpoints or to create cooperation in monitoring highly salient targets, camera agents bias one another by increasing or decreasing the weights of different feature vectors in other cameras, using mechanisms similar to the excitation and suppression documented in electrophysiology, psychophysics, and imaging studies of low-level visual processing. In addition, when cameras must compete for computing resources, allocation of computational time is weighted by the history of each camera: a camera agent with a history of seeing more salient targets is more likely to obtain computational resources. The system demonstrates the viability of biologically inspired systems for real-time tracking. In future work we plan to implement additional biological mechanisms for the cooperative management of both the sensor and processing resources in this system, including top-down biasing for target specificity as well as novelty and the activity of the tracked object in relation to sensitive features of the environment.
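The abstract describes three mechanisms: a weighted combination of feature conspicuity maps into a saliency map, cross-agent excitation and suppression of feature weights, and CPU-time allocation in proportion to each camera's history of salient detections. The Python sketch below is purely illustrative and is not the authors' implementation (which is a distributed system on a Beowulf cluster); the names CameraAgent, bias, and allocate_cpu_share are hypothetical, and random arrays stand in for the real conspicuity maps computed from video.

"""Illustrative sketch (not the paper's code) of three ideas from the abstract:
(1) saliency as a weighted sum of feature conspicuity maps,
(2) cross-agent excitation/suppression of feature weights, and
(3) CPU-time allocation proportional to each camera's salience history."""

import numpy as np


class CameraAgent:
    """One camera agent with adjustable per-feature weights (hypothetical)."""

    FEATURES = ("intensity", "color", "orientation", "motion")

    def __init__(self, name, rng):
        self.name = name
        self.rng = rng
        # Feature weights start uniform; other agents may bias them later.
        self.weights = {f: 1.0 for f in self.FEATURES}
        # Running history of peak salience values seen by this agent.
        self.salience_history = []

    def conspicuity_maps(self, shape=(48, 64)):
        # Stand-in for the real bottom-up feature maps computed from video.
        return {f: self.rng.random(shape) for f in self.FEATURES}

    def most_salient_location(self):
        maps = self.conspicuity_maps()
        # Saliency map = weighted sum of the per-feature conspicuity maps.
        saliency = sum(self.weights[f] * maps[f] for f in self.FEATURES)
        y, x = np.unravel_index(np.argmax(saliency), saliency.shape)
        peak = float(saliency[y, x])
        self.salience_history.append(peak)
        return (y, x), peak

    def bias(self, feature, factor):
        # Excitation (factor > 1) or suppression (factor < 1) of one feature,
        # requested by another agent seeking cooperation or viewpoint diversity.
        self.weights[feature] *= factor


def allocate_cpu_share(agents, recent=10):
    """Split computation time in proportion to recent salience history."""
    scores = np.array([sum(a.salience_history[-recent:]) or 1e-6 for a in agents])
    return dict(zip((a.name for a in agents), scores / scores.sum()))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    agents = [CameraAgent(f"cam{i}", rng) for i in range(12)]
    for agent in agents:
        loc, peak = agent.most_salient_location()
        print(agent.name, "attends", loc, "peak salience", round(peak, 3))
    # Cross-agent biasing example: cam0 suppresses motion in cam1 so the two
    # cameras do not both lock onto the same moving target.
    agents[1].bias("motion", 0.5)
    print(allocate_cpu_share(agents))

In the sketch, agents that have recently seen stronger salience peaks receive a larger share of computation time, mirroring the history-based resource competition described above; the biasing call stands in for the inter-camera excitation and suppression messages.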

Paper Details

Date Published: 30 September 2003
PDF: 12 pages
Proc. SPIE 5267, Intelligent Robots and Computer Vision XXI: Algorithms, Techniques, and Active Vision, (30 September 2003); doi: 10.1117/12.515176
Author Affiliations:
Terrell N Mundhenk, Univ. of Southern California (United States) and The Aerospace Corp. (United States)
Nitin Dhavale, Univ. of Southern California (United States)
Salvador Marmol, Univ. of Southern California (United States)
Elizabeth Calleja, Univ. of Southern California (United States)
Vidhya Navalpakkam, Univ. of Southern California (United States)
Kirstie Bellman, The Aerospace Corp. (United States)
Chris Landauer, The Aerospace Corp. (United States)
Michael A Arbib, Univ. of Southern California (United States)
Laurent Itti, Univ. of Southern California (United States)


Published in SPIE Proceedings Vol. 5267:
Intelligent Robots and Computer Vision XXI: Algorithms, Techniques, and Active Vision
David P. Casasent; Ernest L. Hall; Juha Röning, Editor(s)

© SPIE.