
Proceedings Paper

Neuromorphic algorithms for computer vision and attention
Author(s): Florence Miau; Constantine S. Papageorgiou; Laurent Itti

Paper Abstract

We describe an integrated vision system which reliably detects persons in static color natural scenes, or other targets among distracting objects. The system is built upon the biologically-inspired synergy between two processing stages: A fast trainable visual attention front-end (where), which rapidly selects a restricted number of conspicuous image locations, and a computationally expensive object recognition back-end (what), which determines whether the selected locations are targets of interest. We experiment with two recognition back-ends: One uses a support vector machine algorithm and achieves highly reliable recognition of pedestrians in natural scenes, but is not particularly biologically plausible, while the other is directly inspired by the neurobiology of inferotemporal cortex, but is not yet as robust with natural images. Integrating the attention and recognition algorithms yields substantial speedup over exhaustive search, while preserving detection rate. The success of this approach demonstrates that using a biological attention-based strategy to guide an object recognition system may represent an efficient strategy for rapid scene analysis.
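The where/what pipeline described in the abstract can be illustrated with a minimal sketch. Note the assumptions: the `saliency`, `top_k_locations`, and `recognize` functions below are hypothetical stand-ins, not the paper's actual components — the real front-end is the Itti–Koch saliency model and the real back-ends are an SVM pedestrian detector and an inferotemporal-cortex-inspired recognizer. The toy saliency here is simple contrast against the image mean, and the "expensive" recognizer is a trivial value test, kept only to show the control flow: attend first, then classify only the attended locations.

```python
import numpy as np

def saliency(img):
    """Toy 'where' stage: conspicuity as absolute contrast to the image mean.

    (A stand-in for a real saliency model such as Itti & Koch's.)
    """
    return np.abs(img - img.mean())

def top_k_locations(sal, k):
    """Select the k most conspicuous (row, col) positions from a saliency map."""
    flat = np.argsort(sal.ravel())[::-1][:k]
    return [np.unravel_index(i, sal.shape) for i in flat]

def recognize(img, loc, target_value=1.0):
    """Toy 'what' stage: the computationally expensive check, applied per location.

    (A stand-in for a real back-end such as an SVM classifier.)
    """
    return img[loc] == target_value

def detect(img, k=5):
    """Attention-guided detection: classify only the k most salient locations,
    instead of exhaustively scanning every position in the image."""
    sal = saliency(img)
    return [loc for loc in top_k_locations(sal, k) if recognize(img, loc)]

# Demo: a uniform background with two bright "targets".
img = np.zeros((8, 8))
img[2, 3] = 1.0
img[6, 1] = 1.0
hits = detect(img, k=5)  # only 5 of the 64 locations reach the expensive stage
```

The speedup claimed in the abstract comes from this structure: the cheap front-end prunes the search space so the expensive back-end runs on k candidate locations rather than on every position in the image, while the targets, being conspicuous, are still among the candidates.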

Paper Details

Date Published: 14 November 2001
PDF: 12 pages
Proc. SPIE 4479, Applications and Science of Neural Networks, Fuzzy Systems, and Evolutionary Computation IV, (14 November 2001); doi: 10.1117/12.448343
Author Affiliations:
Florence Miau, Univ. of Southern California (United States)
Constantine S. Papageorgiou, MIT Artificial Intelligence Lab. (United States)
Laurent Itti, Univ. of Southern California (United States)

Published in SPIE Proceedings Vol. 4479:
Applications and Science of Neural Networks, Fuzzy Systems, and Evolutionary Computation IV
Bruno Bosacchi; David B. Fogel; James C. Bezdek, Editor(s)

© SPIE.