
Proceedings Paper

Fault tolerance of neural networks with noisy training sets
Author(s): Jay I. Minnix

Paper Abstract

It is well established that training backpropagation networks with noisy training sets increases the generalization capabilities of the network. Since input set noise is somewhat analogous to faults in the network, networks trained on noisy inputs should exhibit fault tolerance superior to that of similar networks trained on noiseless inputs. This paper presents the results of a study to determine the effect of noisy training sets on fault tolerance. Backpropagation was used to train three sets of networks on 7 × 7 numeral patterns. One set served as the control and was trained on noiseless inputs; the other two were trained under two different noise conditions. Several network examples were trained for each of the three cases (no noise, 10% noise, and 20% noise). The noise was injected into each training image uniformly at random, taking the form of toggled (0 to 1 and 1 to 0) pixel values in the binary input images. After learning was complete, the networks were tested for their tolerance to stuck-at-1 and stuck-at-0 element faults, as well as weight connection faults. The networks trained on noisy inputs had substantially better fault tolerance than the networks trained on noiseless inputs.
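The two mechanisms the abstract describes, uniform random pixel toggling in binary training images and stuck-at element faults, can be sketched as follows. This is an illustrative sketch, not the paper's actual code; the function names and the flat-list image representation are assumptions.

```python
import random

def add_toggle_noise(image, noise_rate, rng=None):
    """Return a copy of a binary image in which each pixel is toggled
    (0 -> 1, 1 -> 0) independently with probability noise_rate,
    mirroring the uniform random pixel-toggle noise described above."""
    rng = rng or random.Random()
    return [1 - px if rng.random() < noise_rate else px for px in image]

def apply_stuck_at_fault(activations, unit_index, stuck_value):
    """Simulate a stuck-at-0 or stuck-at-1 element fault: the faulty
    unit's output is clamped to stuck_value regardless of its input."""
    faulty = list(activations)
    faulty[unit_index] = stuck_value
    return faulty
```

For a 7 × 7 numeral pattern flattened to 49 binary values, a 10% noise rate toggles about 5 pixels per training image on average; fault tolerance would then be measured by comparing classification accuracy before and after clamping units with `apply_stuck_at_fault`.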

Paper Details

Date Published: 1 July 1992
PDF: 9 pages
Proc. SPIE 1710, Science of Artificial Neural Networks, (1 July 1992); doi: 10.1117/12.140101
Author Affiliations:
Jay I. Minnix, Stanford Telecommunications, Inc. (United States)

Published in SPIE Proceedings Vol. 1710:
Science of Artificial Neural Networks
Dennis W. Ruck, Editor(s)

© SPIE.