
Proceedings Paper

Optimization of ART network with Boltzmann machine
Author(s): Omid M. Omidvar; Charles L. Wilson

Paper Abstract

Optimization of large neural networks is essential for improving network speed and generalization power while simultaneously reducing training error and network complexity. Boltzmann methods have been used as a statistical technique for combinatorial optimization and for the design of learning algorithms. In the networks studied here, Adaptive Resonance Theory (ART) serves as a connection creation operator and the Boltzmann method serves as a competitive connection annihilation operator. By combining these two methods it is possible to generate small networks that have similar testing and training accuracy and good generalization from small training sets. Our findings demonstrate that for a character recognition problem the number of weights in a fully connected network can be reduced by over 80%. We have applied the Boltzmann criterion to differential pruning of the connections, which is based on the weight contents rather than on the number of connections.
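The abstract describes Boltzmann-style differential pruning driven by weight contents. As a minimal sketch of that idea (a hypothetical illustration, not the authors' published algorithm), one can annihilate a connection with a Boltzmann acceptance probability that decays with the weight's magnitude, so small-magnitude connections are pruned preferentially:

```python
import math
import random

def boltzmann_prune(weights, temperature, rng=random.random):
    """Boltzmann-style differential pruning (illustrative sketch only).

    Each connection of weight w is annihilated with probability
    exp(-|w| / T): the pruning pressure depends on the weight's
    content (magnitude), not on a fixed count of connections.
    """
    pruned = []
    for w in weights:
        p_remove = math.exp(-abs(w) / temperature)
        pruned.append(0.0 if rng() < p_remove else w)
    return pruned

# Deterministic illustration: fixing rng() at 0.5 means any weight
# whose removal probability exceeds 0.5 (i.e. |w| < T * ln 2) is pruned.
weights = [0.05, 0.9, -0.02, -1.3, 0.4]
result = boltzmann_prune(weights, temperature=0.5, rng=lambda: 0.5)
# small-magnitude weights 0.05 and -0.02 are annihilated; the rest survive
```

Lowering the temperature over training would make the criterion increasingly selective, in the spirit of annealed combinatorial optimization; the function name and probability form here are assumptions for illustration.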

Paper Details

Date Published: 19 August 1993
PDF: 11 pages
Proc. SPIE 1966, Science of Artificial Neural Networks II, (19 August 1993); doi: 10.1117/12.152650
Author Affiliations:
Omid M. Omidvar, Univ. of the District of Columbia (United States)
Charles L. Wilson, National Institute of Standards and Technology (United States)

Published in SPIE Proceedings Vol. 1966:
Science of Artificial Neural Networks II
Dennis W. Ruck, Editor(s)

© SPIE. Terms of Use