
Proceedings Paper

Incorporating content sensitivity into standard video compression algorithms
Author(s): Krishna Ratakonda; Ligang Lu

Paper Abstract

Typical implementations of standard video compression algorithms use variance-based perceptual masking to adapt the level of compression noise to the video content, the idea being that the human visual system is more sensitive to errors in low-variance regions of the image. Quality assessment algorithms likewise employ spatial-frequency-adaptive measures to judge compressed video quality. Such simplistic analysis runs into problems with "clean edges" in video (for example, text), which are classified as high-frequency areas yet remain very sensitive to compression noise. Discriminating between such "good" and "bad" texture is particularly important for low-bit-rate Internet video transmission. In this paper we propose a novel, computationally simple scheme that further distinguishes between "clean" and "random" edges. By comparing the proposed algorithm with standard texture classification schemes from the literature, we show that it achieves a substantial improvement in discriminatory ability at a much lower computational cost. Furthermore, we show that the proposed scheme can be applied on a per-macroblock basis, making it particularly suitable for driving standard block-based video compression algorithms such as the widely used H.261/3 or MPEG-1/2.
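The paper itself does not disclose its classifier here, but the idea the abstract describes can be illustrated with a minimal sketch: compute per-macroblock variance as the usual masking cue, then separate "clean" edges (coherent gradient direction, as in text) from "random" texture using the structure tensor. All thresholds and function names below are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def block_variance(block):
    """Luminance variance of a macroblock -- the classic masking cue."""
    return float(np.var(block))

def edge_coherence(block):
    """Fraction of gradient energy aligned with the dominant direction.

    Clean edges (e.g. text) concentrate gradient energy in one
    direction; random texture spreads it across directions.
    """
    gy, gx = np.gradient(block.astype(np.float64))
    # Structure tensor entries, averaged over the block.
    jxx, jyy, jxy = np.mean(gx * gx), np.mean(gy * gy), np.mean(gx * gy)
    trace = jxx + jyy
    if trace < 1e-12:               # flat block: no gradients at all
        return 0.0
    # Larger eigenvalue of the 2x2 structure tensor.
    lam1 = 0.5 * (trace + np.hypot(jxx - jyy, 2.0 * jxy))
    return float(lam1 / trace)      # 1.0 = perfectly coherent edge

def classify_macroblock(block, var_thresh=100.0, coh_thresh=0.9):
    """Toy three-way decision (thresholds are illustrative guesses)."""
    if block_variance(block) < var_thresh:
        return "flat"
    return "clean_edge" if edge_coherence(block) > coh_thresh else "texture"

# A text-like vertical step edge vs. random noise, both 16x16 macroblocks.
rng = np.random.default_rng(0)
step = np.zeros((16, 16))
step[:, 8:] = 255.0
noise = rng.uniform(0.0, 255.0, (16, 16))
```

Both test blocks have high variance, so a variance-only masking rule would treat them identically; the coherence measure is what separates the step edge (which should receive finer quantization) from the noise block (which can mask more compression error).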

Paper Details

Date Published: 22 March 2001
PDF: 4 pages
Proc. SPIE 4209, Multimedia Systems and Applications III, (22 March 2001); doi: 10.1117/12.420833
Krishna Ratakonda, IBM Thomas J. Watson Research Ctr. (United States)
Ligang Lu, IBM Thomas J. Watson Research Ctr. (United States)


Published in SPIE Proceedings Vol. 4209:
Multimedia Systems and Applications III
Andrew G. Tescher; Bhaskaran Vasudev; V. Michael Bove, Editor(s)

© SPIE