
Proceedings Paper

Optimal learning capability assessment of multicategory neural nets
Author(s): Leda Villalobos; Francis L. Merat

Paper Abstract

In this paper, it is shown that supervised learning can be posed as an optimization problem in which inequality constraints encode the information contained in the training patterns and specify the degree of accuracy expected from the neural network. Starting from this formulation, a technique is developed for evaluating the learning capability and optimizing the feature space of a class of higher-order neural networks. The technique gives significant insight into the problem of task learning. It makes it possible to establish whether the structure of the network can effectively learn the training patterns. If the structure is not appropriate for learning, the technique identifies the minimal set of patterns that cannot be learned with the desired accuracy; otherwise, it provides a connectivity that yields satisfactory network performance. Furthermore, it identifies features that can be suppressed from the definition of the feature space without degrading network performance. Several examples are presented and the results are discussed.
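The abstract's core idea, posing learnability as a system of inequality constraints, one per training pattern, can be illustrated with a small sketch. The feature map `phi`, the margin parameter, and the perceptron-based feasibility search below are illustrative assumptions, not the authors' formulation: each pattern x_i with target y_i contributes the constraint y_i * (w . phi(x_i)) >= margin, and a weight vector satisfying all constraints certifies that the higher-order unit can learn the set.

```python
# Hypothetical sketch of learnability as constraint satisfaction.
# Each training pattern imposes one inequality constraint on the
# weights of a single higher-order (polynomial) unit; we search for
# a feasible weight vector with the margin-perceptron rule. This is
# an illustrative stand-in for the paper's optimization formulation,
# not the authors' exact method.

def phi(x, order=2):
    """Feature map: bias and raw inputs; pairwise products if order >= 2."""
    feats = [1.0] + list(x)
    if order >= 2:
        n = len(x)
        for i in range(n):
            for j in range(i, n):
                feats.append(x[i] * x[j])
    return feats

def learnable(patterns, targets, order=2, margin=0.1, max_epochs=1000):
    """Return (feasible, w, violated), where `violated` lists indices of
    patterns whose constraint y * (w . phi(x)) >= margin is still unmet
    after the last epoch (the unmet set, not a provably minimal one)."""
    feats = [phi(x, order) for x in patterns]
    w = [0.0] * len(feats[0])
    for _ in range(max_epochs):
        violated = []
        for i, (f, y) in enumerate(zip(feats, targets)):
            if y * sum(wi * fi for wi, fi in zip(w, f)) < margin:
                violated.append(i)
                for k in range(len(w)):
                    w[k] += y * f[k]  # perceptron update on violated constraint
        if not violated:
            return True, w, []
    # Non-convergence is only heuristic evidence of infeasibility.
    return False, w, violated

# XOR: infeasible with first-order features, feasible once the
# product term x1*x2 enters the feature space.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
Y = [-1, 1, 1, -1]
print(learnable(X, Y, order=1)[0])  # False: first-order unit cannot learn XOR
print(learnable(X, Y, order=2)[0])  # True: second-order unit can
```

The XOR example mirrors the abstract's point about feature-space optimization: whether the training patterns are learnable depends on which higher-order terms the feature space includes, and the violated-constraint list is a rough analogue of the paper's minimal set of unlearnable patterns.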

Paper Details

Date Published: 19 August 1993
PDF: 12 pages
Proc. SPIE 1966, Science of Artificial Neural Networks II, (19 August 1993); doi: 10.1117/12.152637
Author Affiliations:
Leda Villalobos, Case Western Reserve Univ. (United States)
Francis L. Merat, Case Western Reserve Univ. (United States)

Published in SPIE Proceedings Vol. 1966:
Science of Artificial Neural Networks II
Dennis W. Ruck, Editor(s)
