
Proceedings Paper

Training autoassociative recurrent neural network with preprocessed training data
Author(s): Arun Maskara; Andrew Noetzel

Paper Abstract

The Auto-Associative Recurrent Network (AARN), a modified version of the Simple Recurrent Network (SRN), can be trained to behave as a recognizer of a language generated by a regular grammar. The network is trained successfully on an unbounded number of sequences of the language, generated randomly from the Finite State Automaton (FSA) of the language. But the training algorithm fails when training is restricted to a fixed finite set of examples. Here, we present a new algorithm for training the AARN from a finite set of language examples. A tree is constructed by preprocessing the training data. The AARN is trained with sequences generated randomly from the tree. The results of the simulation experiments are discussed.
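The abstract describes preprocessing the finite training set into a tree and then sampling training sequences randomly from it. The paper itself is behind a paywall here, so the following is only a minimal sketch of one plausible reading of that step: the training sequences are inserted into a prefix tree (trie), and a random walk from the root regenerates sequences for training. All names and the uniform-sampling choice are assumptions, not taken from the paper.

```python
import random

# Hypothetical sketch of the preprocessing step described in the abstract:
# build a prefix tree from a finite set of training sequences, then draw
# random sequences from it. The uniform branch choice is an assumption.

END = "<end>"  # sentinel key marking the end of a complete training sequence


def build_tree(sequences):
    """Insert each training sequence into a prefix tree (trie of dicts)."""
    root = {}
    for seq in sequences:
        node = root
        for symbol in seq:
            node = node.setdefault(symbol, {})
        node[END] = {}  # mark that a training sequence ends here
    return root


def sample_sequence(root, rng=random):
    """Random walk from the root, picking a child uniformly at each node."""
    seq, node = [], root
    while node:
        symbol = rng.choice(list(node.keys()))
        if symbol == END:
            break  # reached the end of a stored sequence
        seq.append(symbol)
        node = node[symbol]
    return seq


if __name__ == "__main__":
    tree = build_tree(["ab", "ac", "abb"])
    print("".join(sample_sequence(tree)))
```

Because every walk terminates only at an end marker, each sampled sequence is one of the original training examples; the tree merely shares common prefixes, so the sampling distribution is biased toward sequences with rarer prefixes.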

Paper Details

Date Published: 19 August 1993
PDF: 9 pages
Proc. SPIE 1966, Science of Artificial Neural Networks II, (19 August 1993); doi: 10.1117/12.152645
Author Affiliations:
Arun Maskara, New Jersey Institute of Technology (United States)
Andrew Noetzel, The William Paterson College (United States)

Published in SPIE Proceedings Vol. 1966:
Science of Artificial Neural Networks II
Dennis W. Ruck, Editor(s)

© SPIE. Terms of Use