
Proceedings Paper

Multiresolution training of Kohonen neural networks
Author(s): Dan E. Tamir

Paper Abstract

This paper analyses a trade-off between convergence rate and distortion obtained through multi-resolution training of a Kohonen Competitive Neural Network. Empirical results show that a multi-resolution approach can improve the training stage of several unsupervised pattern classification algorithms, including K-means clustering, LBG vector quantization, and competitive neural networks. While previous research concentrated on the convergence rate of on-line unsupervised training, new results reported in this paper show that the multi-resolution approach can also be used to improve training quality (measured as a derivative of the rate-distortion function) at the expense of convergence speed. The probability of achieving a desired point in the quality/convergence-rate space of Kohonen Competitive Neural Networks (KCNN) is evaluated using a detailed set of Monte Carlo experiments. It is shown that multi-resolution can reduce the distortion by a factor of 1.5 to 6 while maintaining the convergence rate of traditional KCNN. Alternatively, the convergence rate can be improved without loss of quality. The experiments include a controlled set of synthetic data as well as image data. Experimental results are reported and evaluated.
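The multi-resolution idea described in the abstract — training a competitive (winner-take-all) network on a coarse representation of the data before refining on the full-resolution set — can be illustrated with a minimal sketch. This is a hypothetical illustration, not the paper's algorithm: the coarse-to-fine schedule (every 2^j-th sample), the fixed learning rate, the deterministic codebook initialization, and the 1-D synthetic data are all assumptions made for brevity.

```python
# Illustrative sketch of coarse-to-fine (multi-resolution) competitive
# training in 1-D. NOT the paper's exact method: schedule, learning rate,
# and initialization are assumptions for demonstration only.
import random

def distortion(data, codebook):
    """Mean squared distance from each sample to its nearest code vector."""
    return sum(min((x - c) ** 2 for c in codebook) for x in data) / len(data)

def competitive_epoch(data, codebook, lr):
    """One epoch of winner-take-all (Kohonen competitive) updates."""
    for x in data:
        # Find the winning (nearest) code vector ...
        w = min(range(len(codebook)), key=lambda i: (x - codebook[i]) ** 2)
        # ... and move only the winner toward the sample.
        codebook[w] += lr * (x - codebook[w])
    return codebook

def multiresolution_train(data, k, levels=3, lr=0.1):
    """Train on coarse subsamples first, then progressively finer ones.

    Level j uses every 2**j-th sample, so early epochs see a cheap,
    low-resolution view of the data and later epochs refine on all of it.
    """
    lo, hi = min(data), max(data)
    # Deterministic initialization: spread code vectors over the data range.
    codebook = [lo + (i + 0.5) * (hi - lo) / k for i in range(k)]
    for j in reversed(range(levels)):      # coarse (large stride) first
        codebook = competitive_epoch(data[:: 2 ** j], codebook, lr)
    return codebook

random.seed(0)
# Two well-separated 1-D Gaussian clusters, centered at 0 and 10.
data = ([random.gauss(0.0, 0.5) for _ in range(200)]
        + [random.gauss(10.0, 0.5) for _ in range(200)])
random.shuffle(data)

cb = multiresolution_train(data, k=2)
print(sorted(cb), distortion(data, cb))
```

In this toy setting the two code vectors settle near the cluster centers, and the final distortion approaches the within-cluster variance; the coarse levels are cheaper per epoch because they touch only a fraction of the samples, which is the convergence-rate side of the trade-off the paper quantifies.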

Paper Details

Date Published: 17 September 2007
PDF: 10 pages
Proc. SPIE 6700, Mathematics of Data/Image Pattern Recognition, Compression, Coding, and Encryption X, with Applications, 67000B (17 September 2007); doi: 10.1117/12.735394
Author Affiliations:
Dan E. Tamir, Texas State Univ., San Marcos (United States)


Published in SPIE Proceedings Vol. 6700:
Mathematics of Data/Image Pattern Recognition, Compression, Coding, and Encryption X, with Applications
Gerhard X. Ritter; Mark S. Schmalz; Junior Barrera; Jaakko T. Astola, Editor(s)

© SPIE.