
Proceedings Paper

Homogeneous and Layered Alternating Projection Neural Networks
Author(s): R. J. Marks II; S. Oh; L. E. Atlas; J. A. Ritcey

Paper Abstract

We consider a class of neural networks whose performance can be analyzed and geometrically visualized in a signal-space environment. Alternating projection neural networks (APNNs) operate by alternately projecting between two or more constraint sets. Criteria for desired and unique convergence are easily established. The network can be taught from a training set by viewing each library vector only once. The network can be configured as either a content-addressable memory (homogeneous form) or a classifier (layered form). The number of patterns that can be stored in the network is on the order of the number of input and hidden neurons. If the output neurons can take on only one of two states, then the trained layered APNN can easily be configured to converge in one iteration. More generally, convergence is at an exponential rate. Convergence can be improved by the use of sigmoid-type nonlinearities, network relaxation, and/or increasing the number of neurons in the hidden layer. The manner in which the network generalizes can be directly evaluated.
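The abstract's core operation, alternately projecting between constraint sets, can be illustrated with a minimal sketch of projection onto convex sets (POCS). The toy constraint sets below (a hyperplane and a unit box) and all function names are illustrative assumptions, not the paper's actual network or code:

```python
# Minimal POCS sketch: alternate projections between two convex sets
# until the iterate lies (approximately) in their intersection.
# The constraint sets here are a hypothetical toy example, not the
# signal-space constraints used in the paper.
import numpy as np

def project_hyperplane(x, a, b):
    """Orthogonal projection onto the hyperplane {x : a @ x = b}."""
    return x - ((a @ x - b) / (a @ a)) * a

def project_box(x, lo=0.0, hi=1.0):
    """Orthogonal projection onto the box [lo, hi]^n."""
    return np.clip(x, lo, hi)

def alternating_projections(x0, a, b, iters=200):
    """Alternately project; for convex sets with nonempty intersection,
    the iterates converge (at a geometric rate) to a point in both sets."""
    x = x0
    for _ in range(iters):
        x = project_box(project_hyperplane(x, a, b))
    return x

a = np.array([1.0, 2.0, 3.0])
b = 2.0
x = alternating_projections(np.array([5.0, -3.0, 4.0]), a, b)
# x now lies in the box exactly and satisfies a @ x == b to high accuracy
```

The geometric convergence seen here mirrors the abstract's claim of exponential-rate convergence for the general case.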

Paper Details

Date Published: 8 February 1989
Proc. SPIE 0960, Real-Time Signal Processing for Industrial Applications, (8 February 1989); doi: 10.1117/12.947804
Author Affiliations:
R. J. Marks II, University of Washington (United States)
S. Oh, University of Washington (United States)
L. E. Atlas, University of Washington (United States)
J. A. Ritcey, University of Washington (United States)

Published in SPIE Proceedings Vol. 0960:
Real-Time Signal Processing for Industrial Applications
Bahram Javidi, Editor(s)

© SPIE.