
Proceedings Paper

Active vision and image/video understanding systems built upon network-symbolic models for perception-based navigation of mobile robots in real-world environments
Author(s): Gary Kuvich

Paper Abstract

To be completely successful, robots need reliable perceptual systems that are similar to human vision. Geometric operations are hard to apply to the processing of natural images. Instead, the brain builds a relational network-symbolic structure of the visual scene, using different cues to establish the relational order of surfaces and objects with respect to the observer and to each other. In biologically inspired Network-Symbolic systems, feature, symbol, and predicate are equivalent. A linking mechanism binds these features/symbols into coherent structures, and the image is converted from a “raster” into a “vector” representation. View-based object recognition is a hard problem for traditional algorithms that directly match a primary view of an object to a model. In Network-Symbolic models, the derived structure, not the primary view, is the subject of recognition. Such recognition is unaffected by local changes and by variations in the object's appearance across a set of similar views. Once built, the model of the visual scene changes more slowly than the local information in the visual buffer. This allows the system to disambiguate visual information and to control actions and navigation effectively via incremental relational changes in the visual buffer. Network-Symbolic models can be seamlessly integrated into the NIST 4D/RCS architecture, enabling it to better interpret images and video for situation awareness, target recognition, navigation, and action.
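The abstract's central idea, binding detected features into a relational graph and recognizing the derived structure rather than the raw view, can be illustrated with a short sketch. The following Python example is not the paper's implementation: the feature labels, relation names, and the use of graph isomorphism (via the networkx library) as a stand-in for the paper's matching mechanism are all illustrative assumptions.

# Sketch of a relational "network-symbolic" scene structure: feature
# nodes bound by relational edges, with recognition performed on the
# derived graph rather than on raw pixels. Labels and relations are
# hypothetical placeholders, not the paper's actual model.
import networkx as nx
from networkx.algorithms.isomorphism import (
    categorical_node_match,
    categorical_edge_match,
)

def build_scene_graph(features, relations):
    """Bind features/symbols into a coherent relational structure."""
    g = nx.Graph()
    for name, label in features:
        g.add_node(name, label=label)
    for a, b, rel in relations:
        g.add_edge(a, b, rel=rel)
    return g

# A stored model: a cup seen as a cylindrical body with a handle,
# resting on a planar support.
model = build_scene_graph(
    features=[("body", "cylinder"), ("handle", "arc"), ("support", "plane")],
    relations=[("body", "handle", "attached"), ("body", "support", "rests_on")],
)

# A new view: different node identities (local appearance changed),
# but the same relational structure.
view = build_scene_graph(
    features=[("f1", "cylinder"), ("f2", "arc"), ("f3", "plane")],
    relations=[("f1", "f2", "attached"), ("f1", "f3", "rests_on")],
)

# Structural match: invariant to the local changes that defeat direct
# view-to-model matching.
same = nx.is_isomorphic(
    model, view,
    node_match=categorical_node_match("label", None),
    edge_match=categorical_edge_match("rel", None),
)
print(same)  # True

Because the comparison operates on labeled structure, renaming nodes or perturbing local appearance leaves the match unchanged, which is the invariance property the abstract attributes to recognizing the derived structure.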

Paper Details

Date Published: 29 December 2004
PDF: 15 pages
Proc. SPIE 5609, Mobile Robots XVII (29 December 2004); doi: 10.1117/12.577747
Author Affiliations:
Gary Kuvich, Smart Computer Vision Systems (United States)


Published in SPIE Proceedings Vol. 5609:
Mobile Robots XVII
Douglas W. Gage, Editor(s)
