
Proceedings Paper

Ortho-ordent initialization of feedforward artificial neural networks (FFANNs) to improve their generalization ability
Author(s): Dilip Sarkar

Paper Abstract

There are several models for neurons and their interconnections. Among them, feedforward artificial neural networks (FFANNs) are very popular because of their simplicity. However, to make them truly reliable and smart information processing systems, characteristics such as learning speed, local minima, and generalization ability need further study. Difficulties such as long learning time and local minima may not affect them as much as the question of generalization ability, because in many applications a network needs only one training, after which it may be used for a long time. The generalization ability of ANNs is of great interest for both theoretical understanding and practical use, because it measures how closely a learning system's actual output approximates the desired output for an input it has never seen. We investigate novel techniques for systematic initialization (as opposed to purely random initialization) of FFANN architectures to improve their generalization ability. Our preliminary work has successfully employed row-vectors of Hadamard matrices to generate initializations; this method has produced networks with better generalization ability.
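The idea of a Hadamard-based initialization can be sketched as follows. This is a minimal illustration, not the paper's exact procedure: it assumes a Sylvester-type Hadamard construction, and the scaling factor and the assignment of rows to hidden units are illustrative choices. The key property exploited is that the row-vectors of a Hadamard matrix are mutually orthogonal, so the hidden units start with maximally spread-out weight vectors rather than correlated random ones.

```python
def hadamard(n):
    """Build an n x n Hadamard matrix by the Sylvester construction:
    H(2m) = [[H(m), H(m)], [H(m), -H(m)]]; n must be a power of 2."""
    assert n >= 1 and (n & (n - 1)) == 0, "n must be a power of 2"
    H = [[1]]
    while len(H) < n:
        H = ([row + row for row in H] +
             [row + [-x for x in row] for row in H])
    return H

def init_hidden_weights(n_inputs, n_hidden, scale=0.5):
    """Assign each hidden unit a (scaled) Hadamard row-vector as its
    initial weight vector, instead of purely random values.

    The scale factor and the choice of the first n_hidden rows are
    illustrative assumptions, not the paper's prescription."""
    size = 1
    while size < max(n_inputs, n_hidden):
        size *= 2
    H = hadamard(size)
    # Truncate each of the first n_hidden rows to n_inputs components.
    return [[scale * H[j][i] for i in range(n_inputs)]
            for j in range(n_hidden)]

# Example: a 4-input, 3-hidden-unit layer.
W = init_hidden_weights(n_inputs=4, n_hidden=3)
# Because full Hadamard rows are orthogonal, these initial weight
# vectors have zero pairwise dot products.
dot01 = sum(a * b for a, b in zip(W[0], W[1]))
```

With random initialization, hidden units can start nearly collinear and learn redundant features; orthogonal starting vectors remove that redundancy at the outset, which is one plausible route to the improved generalization reported.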

Paper Details

Date Published: 6 April 1995
PDF: 12 pages
Proc. SPIE 2492, Applications and Science of Artificial Neural Networks, (6 April 1995); doi: 10.1117/12.205191
Dilip Sarkar, Univ. of Miami (United States)


Published in SPIE Proceedings Vol. 2492:
Applications and Science of Artificial Neural Networks
Steven K. Rogers; Dennis W. Ruck, Editors
