
Proceedings Paper

Learning of dynamic variations of N-dimension patterns in a noniterative neural network
Author(s): Chia-Lun John Hu; Sirikahlaya Chanekasit

Paper Abstract

Given a set of preprocessed N-dimensional analog pattern vectors {Um, m = 1 to M}, where each Um represents a distinct pattern, if N >> M then a one-layered sign-function neural network (OLNN) suffices for very robust, yet very accurate, noniterative learning of all the patterns. After the learning is done, the OLNN makes an accurate identification of an untrained test pattern even when that pattern varies within a certain dynamic range of a particular standard pattern learned during the noniterative learning process. The analytical foundation that makes this dynamic neural network pattern recognition possible is the following. If we know that a standard pattern Um will vary gradually among K boundary patterns Um1 to UmK, then we can train the neural network noniteratively to learn JUST THE BOUNDARY vectors {Umi, i = 1 to K} for each pattern Um. Then, owing to a distinctive property of noniterative learning, for a test input pattern Ut lying anywhere within those boundaries (i.e., Ut equal to any CONVEX combination of the boundary set {Umi, i = 1 to K, m fixed}), the OLNN still automatically recognizes the changed pattern, even though none of these gradually changed patterns was learned step by step during the noniterative learning.
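The abstract's claim can be illustrated with a minimal sketch. The sketch below is not the authors' code; it assumes the noniterative learning is a single pseudoinverse solve of W U = T (one common noniterative scheme for a one-layered network), with hypothetical sizes N, M, K and one bipolar output neuron per class. Because the trained map is linear before the sign function, a convex combination of one class's boundary vectors maps to the same convex combination of identical targets, so the sign output is unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 50, 3, 4              # dimension N >> M*K trained vectors (assumed sizes)

# K boundary vectors per class, stacked as columns: U is N x (M*K)
U = rng.normal(size=(N, M * K))

# One output neuron per class; target +1 on that class's boundaries, -1 elsewhere
T = -np.ones((M, M * K))
for m in range(M):
    T[m, m * K:(m + 1) * K] = 1.0

# Noniterative "learning": one pseudoinverse solve; since N >> M*K,
# the boundary vectors are (almost surely) linearly independent and W U = T exactly
W = T @ np.linalg.pinv(U)

def classify(u):
    # One-layered sign-function network: hard-limited linear outputs
    return np.sign(W @ u)

# Untrained test pattern: a random CONVEX combination of class 0's boundaries
c = rng.random(K)
c /= c.sum()                    # weights are nonnegative and sum to 1
u_t = U[:, :K] @ c

# W u_t = sum_i c_i * (class-0 target) = class-0 target, so the sign output
# still identifies class 0 even though u_t was never trained on
print(classify(u_t))
```

Here the recognition of the untrained convex combination follows directly from linearity of W, which is the "distinctive property" the abstract appeals to; only the boundary vectors need to be stored and solved for.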

Paper Details

Date Published: 28 March 2005
PDF: 8 pages
Proc. SPIE 5816, Optical Pattern Recognition XVI, (28 March 2005); doi: 10.1117/12.602946
Author Affiliations:
Chia-Lun John Hu, Southern Illinois Univ./Carbondale (United States)
Sirikahlaya Chanekasit, Southern Illinois Univ./Carbondale (United States)


Published in SPIE Proceedings Vol. 5816:
Optical Pattern Recognition XVI
David P. Casasent; Tien-Hsin Chao, Editor(s)

© SPIE.