
Proceedings Paper

Prestructuring neural networks via extended dependency analysis with application to pattern classification
Author(s): George G. Lendaris; Thaddeus T. Shannon; Martin Zwick

Paper Abstract

We consider the problem of matching domain-specific statistical structure to neural-network (NN) architecture. In past work we have considered this problem in the function approximation context; here we consider the pattern classification context. General Systems Methodology tools for finding problem-domain structure suffer exponential scaling of computation with respect to the number of variables considered. Therefore we introduce the use of Extended Dependency Analysis (EDA), which scales only polynomially in the number of variables, for the desired analysis. Based on EDA, we demonstrate a number of NN pre-structuring techniques applicable for building neural classifiers. An example is provided in which EDA results in significant dimension reduction of the input space, as well as capability for direct design of an NN classifier.
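The paper's Extended Dependency Analysis identifies which subsets of input variables carry statistical structure relevant to the output, enabling dimension reduction before a classifier is built. As a rough illustration of that idea only — not the EDA algorithm itself — the sketch below ranks individual discrete inputs by their empirical mutual information with the class label and keeps the informative ones. The data, threshold, and function names are hypothetical.

```python
# Illustrative stand-in for dependency-based input selection (NOT the
# paper's EDA, which searches over variable subsets, not single variables).
import math
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information (in bits) between two discrete sequences."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))   # joint counts
    px = Counter(xs)             # marginal counts for xs
    py = Counter(ys)             # marginal counts for ys
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        mi += p_joint * math.log2(p_joint / ((px[x] / n) * (py[y] / n)))
    return mi

def select_inputs(samples, labels, threshold=0.1):
    """Keep indices of input variables whose MI with the label exceeds threshold."""
    n_vars = len(samples[0])
    keep = []
    for i in range(n_vars):
        column = [s[i] for s in samples]
        if mutual_information(column, labels) > threshold:
            keep.append(i)
    return keep

# Toy data: variables 0 and 2 determine the label; variable 1 is noise.
samples = [(0, 0, 0), (0, 1, 0), (1, 0, 1), (1, 1, 1), (0, 0, 0), (1, 1, 1)]
labels = [0, 0, 1, 1, 0, 1]
print(select_inputs(samples, labels))  # -> [0, 2]
```

The surviving variables would then define a reduced input layer for the neural classifier; EDA proper goes further, using the discovered dependency structure to constrain the network's connectivity as well.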

Paper Details

Date Published: 22 March 1999
PDF: 12 pages
Proc. SPIE 3722, Applications and Science of Computational Intelligence II, (22 March 1999); doi: 10.1117/12.342895
Author Affiliations:
George G. Lendaris, Portland State Univ. (United States)
Thaddeus T. Shannon, Portland State Univ. (United States)
Martin Zwick, Portland State Univ. (United States)


Published in SPIE Proceedings Vol. 3722:
Applications and Science of Computational Intelligence II
Kevin L. Priddy; Paul E. Keller; David B. Fogel; James C. Bezdek, Editor(s)

© SPIE.