
Proceedings Paper

Feature space trajectory representation for active vision
Author(s): Michael A. Sipe; David P. Casasent; Leonard Neiberg

Paper Abstract

A new feature space trajectory (FST) description of 3D distorted views of an object is advanced for active vision applications. In an FST, different distorted object views are vertices in feature space. A new eigen-feature space and Fourier transform features are used. Vertices for adjacent distorted views are connected by straight lines, so that an FST is created as the viewpoint changes. Each object is represented by a distinct FST. An object to be recognized is represented as a point in feature space; the closest FST denotes the class of the object, and the closest line segment on that FST indicates its pose. A new neural network is used to efficiently calculate these distances. We discuss its uses in active vision. Beyond an initial estimate of object class and pose, the FST processor can specify where to move the sensor: to confirm class and pose, to grasp the object, or to focus on a specific object part for assembly or inspection. We advance initial remarks on how many aspect views are needed, and which ones, to represent an object.
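The classification and pose rule described above (nearest FST, nearest segment on it) can be sketched with a brute-force nearest-segment search. The paper itself uses a neural network to compute these distances efficiently; the plain NumPy version below, including the function and class names, is an illustrative assumption, not the authors' implementation.

```python
import numpy as np

def point_segment_distance(p, a, b):
    """Distance from feature point p to the segment joining FST vertices a and b.

    Returns (distance, t), where t in [0, 1] locates the closest point along
    the segment; t interpolates between the two adjacent aspect views, which
    is what yields the pose estimate.
    """
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab)), t

def classify(p, fsts):
    """Classify feature point p against a set of FSTs.

    fsts maps a class label to an (n_views, n_features) array of trajectory
    vertices (one row per distorted aspect view, in viewpoint order).
    Returns (label, segment_index, t, distance) for the closest segment
    over all FSTs: label is the class estimate, (segment_index, t) the pose.
    """
    best = None
    for label, verts in fsts.items():
        for i in range(len(verts) - 1):
            d, t = point_segment_distance(p, verts[i], verts[i + 1])
            if best is None or d < best[3]:
                best = (label, i, t, d)
    return best
```

For example, a test point lying slightly off the first segment of one trajectory is assigned to that class, with `t` giving its interpolated position between the two bracketing aspect views.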

Paper Details

Date Published: 4 April 1997
PDF: 12 pages
Proc. SPIE 3077, Applications and Science of Artificial Neural Networks III, (4 April 1997); doi: 10.1117/12.271506
Michael A. Sipe, Carnegie Mellon Univ. (United States)
David P. Casasent, Carnegie Mellon Univ. (United States)
Leonard Neiberg, Carnegie Mellon Univ. (United States)


Published in SPIE Proceedings Vol. 3077:
Applications and Science of Artificial Neural Networks III
Steven K. Rogers, Editor(s)

© SPIE