Proceedings Paper

Computational complexity of object-based image compression

Paper Abstract

Image compression via transform coding applied to small rectangular regions, or encoding blocks, appears to be approaching asymptotic rate-distortion performance. However, an emerging compression technology, called object-based compression (OBC), promises significantly improved performance via compression ratios ranging from 200:1 to as high as 2,500:1. OBC involves segmentation of image regions, followed by efficient encoding of each region's content and boundary. During decompression, such regions can be approximated by objects from a codebook, yielding a reconstructed image that is semantically equivalent to the corresponding source image but differs from it at the pixel and featural levels. Semantic equivalence between the source and decompressed image facilitates fast decompression through efficient substitutions, albeit at the cost of codebook search in the compression step. Given small codebooks, OBC holds promise for information-push technologies where approximate context is sufficient, for example, transmission of surveillance images that provide the gist of a scene. However, OBC is not necessarily useful for applications requiring high accuracy, such as medical image processing, because substitution of source content can be inaccurate at small spatial scales. The cost of segmentation is a significant disadvantage in current OBC implementations. Several innovative techniques have been developed for region segmentation, as discussed in a previous paper [4]. Additionally, tradeoffs among representational fidelity, computational cost, and storage requirements occur, as with the vast majority of lossy compression algorithms. This paper analyzes the computational (time) and storage (space) complexities of several recent OBC algorithms applied to single-frame imagery. A time complexity model is proposed, which can be associated theoretically with a space complexity model that we have previously published [2].
The result, when combined with measurements of representational accuracy described in a companion paper [5], supports estimation of a time-space-error bandwidth product that could facilitate dynamic optimization of OBC algorithms. In practice, this would support efficient compression with visually acceptable reconstruction for a wide variety of military and domestic applications.
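The codebook search and substitution described above can be sketched as follows. This is a minimal illustration of the general idea, not the paper's algorithm; all function names, the MSE matching criterion, and the flat-list region representation are assumptions chosen for brevity.

```python
# Illustrative sketch of codebook-based region substitution in OBC.
# Regions and codebook objects are represented as flat pixel lists of
# equal length; real OBC systems also encode region boundaries.

def mse(a, b):
    """Mean squared error between two equal-length pixel sequences."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def best_match(region, codebook):
    """Return the index of the codebook object closest to `region`.

    This linear scan is the codebook search incurred at compression
    time: O(|codebook| * |region|) operations per region, which is why
    small codebooks keep OBC compression tractable.
    """
    return min(range(len(codebook)), key=lambda i: mse(region, codebook[i]))

def reconstruct(regions, codebook):
    """Approximate decompression: substitute each segmented region
    with its nearest codebook object. The result is semantically
    similar to the source but differs at the pixel level."""
    return [codebook[best_match(r, codebook)] for r in regions]
```

The substitution step makes decompression cheap (one lookup per region), while the search step concentrates cost at compression time, consistent with the time-space tradeoff analyzed in the paper.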

Paper Details

Date Published: 17 September 2005
PDF: 10 pages
Proc. SPIE 5915, Mathematics of Data/Image Coding, Compression, and Encryption VIII, with Applications, 59150F (17 September 2005); doi: 10.1117/12.613204
Author affiliations:
Mark S. Schmalz, Univ. of Florida (United States)
Gerhard X. Ritter, Univ. of Florida (United States)

Published in SPIE Proceedings Vol. 5915:
Mathematics of Data/Image Coding, Compression, and Encryption VIII, with Applications
Mark S. Schmalz, Editor(s)

© SPIE.