
Proceedings Paper

Simple, fast codebook training algorithm by entropy sequence for vector quantization
Author(s): Chao-yang Pang; Shaowen Yao; Zhang Qi; Shi-xin Sun; Jingde Liu

Paper Abstract

Traditional training algorithms for vector quantization, such as the LBG algorithm, use the convergence of the distortion sequence as the termination condition. In this paper we present a novel training algorithm for vector quantization that instead uses the convergence of the entropy sequence of each region sequence as the termination condition. Compared with the well-known LBG algorithm, it is simple, fast, and easy to understand and control. We test the performance of the algorithm on the standard test images Lena and Barb. The results show that the PSNR difference between the proposed algorithm and LBG is less than 0.1 dB, while its running time is only a fraction of that of LBG.
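The abstract only sketches the idea, but the core change relative to LBG is the stopping rule: rather than waiting for the distortion sequence to converge, training stops once the entropy of the region (partition) sequence stops changing. Below is a minimal illustrative sketch, not the authors' exact algorithm: an LBG-style codebook training loop where the entropy of the region occupancy distribution is tracked across iterations and used as the termination test. The function name `train_codebook_entropy`, the tolerance `tol`, and the random initialization are assumptions made for the example.

```python
# A minimal sketch (assumptions, not the paper's exact method): LBG-style
# codebook training that terminates when the entropy sequence converges.
import numpy as np

def train_codebook_entropy(vectors, codebook_size, tol=1e-4, max_iter=100, seed=0):
    """Train a VQ codebook, stopping when the region-entropy sequence converges."""
    rng = np.random.default_rng(seed)
    # Initialize the codebook with randomly chosen training vectors.
    codebook = vectors[rng.choice(len(vectors), codebook_size, replace=False)].copy()
    prev_entropy = None

    for _ in range(max_iter):
        # Nearest-neighbor partition: assign each vector to its closest codeword.
        dists = np.linalg.norm(vectors[:, None, :] - codebook[None, :, :], axis=2)
        labels = dists.argmin(axis=1)

        # Entropy of the region occupancy distribution p_i = |R_i| / N.
        counts = np.bincount(labels, minlength=codebook_size)
        p = counts[counts > 0] / len(vectors)
        entropy = -np.sum(p * np.log2(p))

        # Centroid update: each codeword becomes the mean of its region.
        for i in range(codebook_size):
            if counts[i] > 0:
                codebook[i] = vectors[labels == i].mean(axis=0)

        # Terminate when the entropy sequence has converged,
        # instead of LBG's distortion-convergence test.
        if prev_entropy is not None and abs(entropy - prev_entropy) < tol:
            break
        prev_entropy = entropy

    return codebook

# Example: train a 16-word codebook on random 4-dimensional training vectors.
if __name__ == "__main__":
    data = np.random.default_rng(1).normal(size=(2000, 4))
    cb = train_codebook_entropy(data, codebook_size=16)
    print(cb.shape)  # (16, 4)
```

The appeal of an entropy-based test, as the abstract suggests, is that the entropy of the partition is cheap to compute from the region counts already produced by the nearest-neighbor assignment, so the stopping check adds essentially no cost per iteration.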

Paper Details

Date Published: 26 September 2001
PDF: 9 pages
Proc. SPIE 4551, Image Compression and Encryption Technologies, (26 September 2001); doi: 10.1117/12.442923
Author Affiliations:
Chao-yang Pang, Univ. of Electronic Science and Technology of China (China)
Shaowen Yao, Kunming Univ. of Science and Technology (China)
Zhang Qi, Kunming Institute for Physics (China)
Shi-xin Sun, Univ. of Electronic Science and Technology of China (China)
Jingde Liu, Univ. of Electronic Science and Technology of China (China)


Published in SPIE Proceedings Vol. 4551:
Image Compression and Encryption Technologies

© SPIE.