
Proceedings Paper

Fast vector quantization algorithm preserving color image quality
Author(s): Christophe Charrier; Hocine Cherifi

Paper Abstract

In the field of color image compression, it is well known that image information is statistically redundant. This redundancy is a handicap in terms of dictionary construction time. One way to counterbalance this time-consuming effect is to reduce the redundancy within the original image while preserving image quality: a random sample can be extracted from the initial training set and used to construct a codebook whose quality equals that of the codebook generated from the entire training set. We apply this idea in the context of a color vector quantization (VQ) compression scheme and propose an algorithm that reduces the complexity of the standard LBG technique. We first define a measure of relevance for each block of the entire training set. Under the assumption that the measure of relevance is an independent random variable, we apply the Kolmogorov statistical test to determine the smallest size of a random sample, and then the sample itself. Finally, from the blocks associated with each measure of relevance in the random sample, we run the standard LBG algorithm to construct the codebook. Psychophysical and statistical measures of image quality allow us to identify the best measure of relevance for reducing the training set while preserving image quality and decreasing the computational cost.
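
The abstract outlines a three-step procedure: score each training block with a measure of relevance, use a Kolmogorov-type test to find the smallest random sample whose relevance distribution matches that of the full training set, and run the standard LBG algorithm on that sample only. The sketch below illustrates this general idea in Python; the relevance measure (per-block variance), the two-sample Kolmogorov-Smirnov test, and the k-means-style LBG loop are illustrative assumptions, not the paper's exact choices.

```python
# A minimal sketch of the general approach described in the abstract, not the
# authors' exact algorithm. Assumptions: the relevance measure is a placeholder
# (per-block variance), the two-sample Kolmogorov-Smirnov test stands in for
# the paper's Kolmogorov test, and LBG is sketched as plain k-means-style
# codebook refinement.
import numpy as np
from scipy.stats import ks_2samp
from scipy.spatial.distance import cdist


def block_relevance(blocks):
    # Hypothetical relevance measure: per-block variance.
    return blocks.reshape(len(blocks), -1).var(axis=1)


def smallest_representative_sample(blocks, alpha=0.05, step=0.05, seed=None):
    """Grow a random sample until its relevance distribution is statistically
    indistinguishable (KS test at level alpha) from that of the full set."""
    rng = np.random.default_rng(seed)
    relevance = block_relevance(blocks)
    n = len(blocks)
    for frac in np.arange(step, 1.0 + 1e-9, step):
        idx = rng.choice(n, size=min(n, max(1, int(frac * n))), replace=False)
        _, p_value = ks_2samp(relevance[idx], relevance)
        if p_value > alpha:  # cannot reject "same distribution": sample is representative
            return blocks[idx]
    return blocks


def lbg_codebook(blocks, codebook_size=256, iters=20, seed=None):
    """Standard LBG/k-means-style codebook construction on flattened blocks."""
    rng = np.random.default_rng(seed)
    data = blocks.reshape(len(blocks), -1).astype(np.float64)
    codebook = data[rng.choice(len(data), codebook_size, replace=False)]
    for _ in range(iters):
        labels = cdist(data, codebook).argmin(axis=1)  # nearest codeword
        for k in range(codebook_size):
            members = data[labels == k]
            if len(members):
                codebook[k] = members.mean(axis=0)     # recompute centroid
    return codebook


if __name__ == "__main__":
    # Toy data: 10 000 random 4x4 RGB blocks standing in for image blocks.
    blocks = np.random.rand(10_000, 4, 4, 3)
    sample = smallest_representative_sample(blocks, seed=0)
    codebook = lbg_codebook(sample, codebook_size=64, seed=0)
    print(f"{len(sample)} of {len(blocks)} blocks used, {len(codebook)} codewords")
```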

Paper Details

Date Published: 1 April 1998
PDF: 12 pages
Proc. SPIE 3308, Very High Resolution and Quality Imaging III, (1 April 1998); doi: 10.1117/12.302431
Author Affiliations:
Christophe Charrier, Univ. Jean Monnet (France)
Hocine Cherifi, Univ. Jean Monnet (France)


Published in SPIE Proceedings Vol. 3308:
Very High Resolution and Quality Imaging III
V. Ralph Algazi; Andrew G. Tescher, Editor(s)
