
Proceedings Paper

Contribution of the number of elements in the hidden layer in a back-propagation network to overall decision accuracy
Author(s): Gary M. Jackson

Paper Abstract

The back-propagation network (BPN) has a minimum of one hidden layer of processing elements between the input and output layers. The addition of one or more hidden layers, together with the generalized delta rule, is responsible for the BPN exceeding the linear restrictiveness of the earlier Perceptron. Although the major significance of the hidden layer has been well established, there is no general agreement on a method for determining the number of hidden elements to use for a given data set. One avenue toward choosing the optimal number of elements in the hidden layer is to better understand how the number of hidden elements contributes to decision accuracy. In the present research, a single-hidden-layer BPN was trained on Anderson's classic iris data set and tested with a 10-fold validation method across separate studies. While holding all other BPN parameters constant, 19 separate tests were conducted, beginning with two hidden elements and increasing to 20 hidden elements.
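The experimental design described above can be sketched in modern terms. This is not the paper's original implementation; it is a minimal illustration, assuming scikit-learn's `MLPClassifier` as a stand-in for a classic single-hidden-layer back-propagation network, evaluated on the iris data set with 10-fold cross-validation while the hidden-layer size is varied from 2 to 20:

```python
# Sketch of the study design: one hidden layer, hidden-element count swept
# from 2 to 20 (19 tests), accuracy estimated by 10-fold cross-validation.
# MLPClassifier with the "sgd" solver is an assumed stand-in for the
# generalized-delta-rule BPN used in the paper.
from sklearn.datasets import load_iris
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)

accuracy_by_size = {}
for n_hidden in range(2, 21):  # 19 tests: 2, 3, ..., 20 hidden elements
    net = make_pipeline(
        StandardScaler(),  # scale inputs; helps gradient descent converge
        MLPClassifier(
            hidden_layer_sizes=(n_hidden,),  # single hidden layer
            solver="sgd",                    # plain gradient descent
            max_iter=500,
            random_state=0,
        ),
    )
    scores = cross_val_score(net, X, y, cv=cv)
    accuracy_by_size[n_hidden] = scores.mean()

for n, acc in sorted(accuracy_by_size.items()):
    print(f"{n:2d} hidden elements: mean 10-fold accuracy {acc:.3f}")
```

All other network parameters (solver, learning schedule, iteration budget) are held constant across the sweep, mirroring the paper's controlled comparison; only the hidden-layer size varies.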

Paper Details

Date Published: 1 February 1994
PDF: 13 pages
Proc. SPIE 2093, Substance Identification Analytics, (1 February 1994); doi: 10.1117/12.172506
Author Affiliations
Gary M. Jackson, Consultant to U.S. Government (United States)

Published in SPIE Proceedings Vol. 2093:
Substance Identification Analytics
James L. Flanagan; Richard J. Mammone; Albert E. Brandenstein; Edward Roy Pike M.D.; Stelios C. A. Thomopoulos; Marie-Paule Boyer; H. K. Huang; Osman M. Ratib, Editor(s)

© SPIE.