Proceedings Paper

Improving convergence and performance of Kohonen's self-organizing scheme
Author(s): Nikhil R. Pal; James C. Bezdek; Eric C.K. Tsao

Paper Abstract

Kohonen-like clustering algorithms (e.g., learning vector quantization) suffer from several major problems. For this class of algorithms, the output often depends on the initialization: if the initial cluster centers lie outside the convex hull of the input data, such an algorithm, even if it terminates, may not produce meaningful prototypes for clustering, because it updates only the winner prototype for each input vector. In this paper we propose a generalization of learning vector quantization (which we call a Kohonen clustering network, or KCN) that, unlike other methods, updates all nodes with each input vector. Moreover, the network attempts to find a minimum of a well-defined objective function. The learning rules depend on the degree of match to the winner node; the lower the degree of match with the winner, the greater the impact on the non-winner nodes. Our numerical results show that the generated prototypes do not depend on the initialization, the learning coefficient, or the number of iterations (provided KCN runs for at least 200 passes through the data). We use Anderson's IRIS data to illustrate our method and compare our results with the standard Kohonen approach.
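The update scheme the abstract describes — every prototype moves with each input, with the pull on non-winners growing as the winner's match weakens — can be sketched as follows. This is a minimal illustration under assumptions, not the paper's exact learning rule: the "degree of match" is modeled here as a simple function of the winner's distance, and `alpha` and `kcn_update` are hypothetical names.

```python
import numpy as np

def kcn_update(prototypes, x, alpha=0.1):
    """One illustrative KCN-style update step (assumed form, not the
    paper's exact formula): all prototypes move toward the input x,
    and the non-winner learning rate scales with how poorly the
    winner matches x."""
    d = np.linalg.norm(prototypes - x, axis=1)  # distances to all nodes
    win = int(np.argmin(d))                     # winner = nearest prototype
    # Model "lack of match" as the winner's share of total distance:
    # a poor winner match (large d[win]) increases the pull on non-winners.
    mismatch = d[win] / (d.sum() + 1e-12)
    for j in range(len(prototypes)):
        rate = alpha if j == win else alpha * mismatch
        prototypes[j] += rate * (x - prototypes[j])
    return prototypes
```

In standard LVQ, by contrast, only `prototypes[win]` would be updated, which is the winner-only behavior the abstract identifies as the source of initialization sensitivity.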

Paper Details

Date Published: 1 July 1992
PDF: 10 pages
Proc. SPIE 1710, Science of Artificial Neural Networks, (1 July 1992); doi: 10.1117/12.140118
Author Affiliations:
Nikhil R. Pal, Univ. of West Florida (United States)
James C. Bezdek, Univ. of West Florida (United States)
Eric C.K. Tsao, Univ. of West Florida (United States)


Published in SPIE Proceedings Vol. 1710:
Science of Artificial Neural Networks
Dennis W. Ruck, Editor(s)

© SPIE. Terms of Use