
Proceedings Paper

Real-time high-performance attention focusing in outdoors color video streams
Author(s): Laurent Itti

Paper Abstract

When confronted with cluttered natural environments, animals still perform orders of magnitude better than artificial vision systems in visual tasks such as orienting, target detection, navigation and scene understanding. To better understand biological visual processing, we have developed a neuromorphic model of how our visual attention is attracted towards conspicuous locations in a visual scene. It replicates processing in the dorsal ('where') visual stream in the primate brain. The model includes a bottom-up (image-based) computation of low-level color, intensity, orientation and flicker features, as well as a nonlinear spatial competition that enhances salient locations in each feature channel. All feature channels feed into a single scalar 'saliency map', which controls where attention is focused next. In this article, we discuss a parallel implementation of the model which runs at 30 frames/s on a 16-CPU Beowulf cluster, and the role of flicker (temporal derivative) cues in computing salience. We show how our simple within-feature competition for salience effectively suppresses strong but spatially widespread motion transients resulting from egomotion. The model robustly detects salient targets in live outdoor video streams, despite large variations in illumination, clutter, and rapid egomotion. The success of this approach suggests that neuromorphic vision algorithms may prove unusually robust for outdoor vision applications.
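The pipeline the abstract describes — center-surround feature extraction per channel, a within-feature competition that promotes maps with a few strong peaks and suppresses maps with many comparable peaks (which is what quiets spatially widespread egomotion transients), summation into a single saliency map, and a winner-take-all selection of the focus of attention — can be sketched roughly as follows. This is a minimal single-scale NumPy illustration, not the paper's implementation: the function names are invented, the competition step is a one-shot simplification of the iterative normalization used in the actual model, and only intensity and flicker channels are shown.

```python
import numpy as np

def _gauss_kernel(sigma):
    """1-D Gaussian kernel, truncated at 3 sigma and normalized to sum 1."""
    r = int(3 * sigma)
    x = np.arange(-r, r + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def _blur(img, sigma):
    """Separable Gaussian blur: convolve rows, then columns."""
    k = _gauss_kernel(sigma)
    tmp = np.apply_along_axis(lambda m: np.convolve(m, k, mode='same'), 0, img)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode='same'), 1, tmp)

def center_surround(img, c_sigma=1.0, s_sigma=4.0):
    """Center-surround response as an absolute difference of Gaussians."""
    return np.abs(_blur(img, c_sigma) - _blur(img, s_sigma))

def within_feature_competition(fmap, thresh=0.1, eps=1e-9):
    """Simplified nonlinear competition: a map dominated by one strong peak is
    boosted; a map with uniformly strong activity (e.g. widespread egomotion
    flicker) is suppressed.  One-shot stand-in for the model's iterative scheme."""
    peak = fmap.max()
    if peak < eps:
        return fmap
    fmap = fmap / peak                      # normalize to [0, 1]
    active = fmap > thresh
    mean_activity = fmap[active].mean() if active.any() else 0.0
    return fmap * (1.0 - mean_activity) ** 2

def saliency(intensity_frame, flicker_frame):
    """Combine per-channel competed feature maps into one scalar saliency map."""
    channels = (intensity_frame, flicker_frame)
    maps = [within_feature_competition(center_surround(f)) for f in channels]
    return sum(maps) / len(maps)

def focus_of_attention(sal):
    """Winner-take-all: attention goes to the saliency-map maximum."""
    return np.unravel_index(np.argmax(sal), sal.shape)
```

In this sketch a flicker channel would simply be the absolute temporal derivative between successive frames; because ego-motion raises flicker almost everywhere, its competed map has high mean activity and is scaled down, while a lone bright target keeps a sharp winning peak.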

Paper Details

Date Published: 30 May 2002
PDF: 9 pages
Proc. SPIE 4662, Human Vision and Electronic Imaging VII, (30 May 2002); doi: 10.1117/12.469519
Author Affiliations:
Laurent Itti, Univ. of Southern California (United States)

Published in SPIE Proceedings Vol. 4662:
Human Vision and Electronic Imaging VII
Bernice E. Rogowitz; Thrasyvoulos N. Pappas, Editor(s)

© SPIE.