
Proceedings Paper

Feature space trajectory (FST) classifier neural network
Author(s): Leonard Neiberg; David P. Casasent

Paper Abstract

A new classifier neural network is described for distortion-invariant multi-class pattern recognition. The training data for each class are described in a feature space. As a distortion parameter (such as aspect view) of a training-set object is varied, an ordered training set is produced. This ordered training set describes the object as a trajectory in feature space, with different points along the trajectory corresponding to different aspect views. Different object classes are described by different trajectories. Classification involves calculating the distance from an input feature space point to the nearest trajectory (this denotes the object class) and the position of the nearest point along that trajectory (this denotes the pose of the object). Comparisons to other neural networks and other classifiers show that this feature space trajectory (FST) neural network yields better classification performance and can reject non-object data. The FST classifier performs well with different numbers of training images and hidden-layer neurons and also generalizes better than other classifiers.
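
To make the classification rule concrete, the following is a minimal sketch, not the authors' implementation, of nearest-trajectory classification. It assumes each class trajectory is stored as an ordered array of feature vectors (one per aspect view) and is approximated by piecewise-linear segments; the function names, the dictionary layout, and the rejection threshold are illustrative assumptions.

```python
import numpy as np

def point_to_segment(x, a, b):
    """Distance from point x to segment a-b, plus the fraction t in [0, 1] of the nearest point."""
    d = b - a
    denom = float(np.dot(d, d))
    t = 0.0 if denom == 0.0 else float(np.clip(np.dot(x - a, d) / denom, 0.0, 1.0))
    nearest = a + t * d
    return float(np.linalg.norm(x - nearest)), t

def fst_classify(x, trajectories, reject_threshold=None):
    """trajectories: dict mapping class label -> (V, d) array of ordered feature-space vertices.
    Returns (label, pose), where pose = segment index + fraction along that segment,
    or (None, None) if the nearest trajectory is farther than reject_threshold
    (rejection of non-object inputs)."""
    best_dist, best_label, best_pose = np.inf, None, None
    for label, verts in trajectories.items():
        for i in range(len(verts) - 1):
            dist, t = point_to_segment(x, verts[i], verts[i + 1])
            if dist < best_dist:
                best_dist, best_label, best_pose = dist, label, i + t
    if reject_threshold is not None and best_dist > reject_threshold:
        return None, None
    return best_label, best_pose

# Illustrative use: two classes, each a trajectory of three 2-D feature vectors.
trajectories = {
    "class_A": np.array([[0.0, 0.0], [1.0, 0.5], [2.0, 1.0]]),
    "class_B": np.array([[0.0, 2.0], [1.0, 2.5], [2.0, 3.0]]),
}
label, pose = fst_classify(np.array([1.1, 0.6]), trajectories, reject_threshold=1.0)
print(label, pose)  # nearest trajectory gives the class; pose locates the aspect view along it
```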

Paper Details

Date Published: 10 October 1994
PDF: 17 pages
Proc. SPIE 2353, Intelligent Robots and Computer Vision XIII: Algorithms and Computer Vision, (10 October 1994); doi: 10.1117/12.188901
Author Affiliations:
Leonard Neiberg, Carnegie Mellon Univ. (United States)
David P. Casasent, Carnegie Mellon Univ. (United States)


Published in SPIE Proceedings Vol. 2353:
Intelligent Robots and Computer Vision XIII: Algorithms and Computer Vision
David P. Casasent, Editor(s)
