
Proceedings Paper

Optimal multidimensional quantization for pattern recognition

Paper Abstract

In non-parametric pattern recognition, the probability density function is approximated by many parameters, one for the density value in each small hyper-rectangular volume of the space. The hyper-rectangles are determined by appropriately quantizing the range of each variable. Optimal quantization yields a compact and efficient representation of the probability density of the data by optimizing a global quantizer performance measure; the measure used here is a weighted combination of average log-likelihood, entropy, and correct classification probability. In multiple dimensions, we study a grid-based quantization technique. Smoothing is an important aspect of optimal quantization because it affects the generalization ability of the quantized density estimates; we use a fast, generalized k-nearest-neighbor smoothing algorithm. We illustrate the effectiveness of optimal quantization on a set of poorly separated Gaussian mixture models, comparing it with the expectation-maximization (EM) algorithm. Optimal quantization produces better results than the EM algorithm, because the convergence of EM to the true parameters can be extremely slow for poorly separated mixtures.
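The core representation described above, a density parameterized by one value per hyper-rectangular cell of a quantized grid, can be sketched as follows. This is an illustrative simplification, not the paper's algorithm: it uses uniform equal-width bins per variable rather than optimized quantizer boundaries, and it omits the smoothing step. The function name and bin count are assumptions for the sketch.

```python
import numpy as np

def grid_density_estimate(data, bins_per_dim=8):
    """Estimate a probability density on a regular grid.

    Each variable's range is quantized into equal-width bins; the
    density in each hyper-rectangular cell is the fraction of samples
    falling in it divided by the cell volume.  (Illustrative only:
    the paper optimizes the quantization boundaries against a global
    performance measure; here the bins are simply uniform.)
    """
    data = np.asarray(data, dtype=float)
    n, d = data.shape
    # Quantize the range of each variable into bins_per_dim intervals.
    edges = [np.linspace(data[:, j].min(), data[:, j].max(), bins_per_dim + 1)
             for j in range(d)]
    # Count samples per hyper-rectangular cell.
    counts, _ = np.histogramdd(data, bins=edges)
    # Normalize by sample count and cell volume so the estimate
    # integrates to 1 over the grid.
    cell_volume = np.prod([(e[-1] - e[0]) / bins_per_dim for e in edges])
    return counts / (n * cell_volume), edges
```

With d variables and b bins each, the estimate has b**d parameters, which is why smoothing matters: many cells receive few or no samples, and an unsmoothed estimate generalizes poorly.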

Paper Details

Date Published: 31 July 2002
PDF: 16 pages
Proc. SPIE 4875, Second International Conference on Image and Graphics, (31 July 2002); doi: 10.1117/12.477142
Author Affiliations
Mingzhou Song, CUNY/Queens College (United States)
Robert M. Haralick, CUNY/Graduate Ctr. (United States)


Published in SPIE Proceedings Vol. 4875:
Second International Conference on Image and Graphics
Wei Sui, Editor(s)

© SPIE.