
Proceedings Paper

Compressed data organization for high throughput parallel entropy coding

Paper Abstract

The difficulty of parallelizing entropy coding increasingly limits the data throughputs achievable in media compression. In this work we analyze the fundamental limitations, using finite-state-machine models to identify the best ways of separating tasks that can be processed independently while minimizing compression losses. This analysis confirms previous work showing that effective parallelization is feasible only if the compressed data is organized in a suitable way, quite different from conventional formats. The proposed new formats exploit the fact that optimal compression is not affected by the arrangement of coded bits, and go further in exploiting the decreasing cost of data processing and memory. Additional advantages include the ability to use increasingly complex data modeling techniques within this framework, and the freedom to mix different types of coding. We confirm the effectiveness of the parallelization with coding simulations on multi-core processors, show how throughput scales with the number of cores, and analyze the additional bit-rate overhead.
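The core idea of the abstract, reorganizing compressed data into independently decodable substreams so that entropy decoding can run on multiple cores, can be illustrated with a minimal sketch. This is not the authors' codec or format: it uses `zlib` as a stand-in entropy coder and Python's `ThreadPoolExecutor` for parallelism, and the function names are hypothetical. It also shows the bit-rate overhead the paper mentions, since each substream carries its own header and termination bits.

```python
# Sketch only: zlib stands in for a real entropy coder, and the
# substream partitioning here is a generic illustration, not the
# format proposed in the paper.
import zlib
from concurrent.futures import ThreadPoolExecutor


def encode_parallel_format(data: bytes, num_streams: int) -> list:
    """Split the input into substreams and entropy-code each one
    independently, so they can later be decoded in parallel."""
    chunk = (len(data) + num_streams - 1) // num_streams
    return [zlib.compress(data[i * chunk:(i + 1) * chunk])
            for i in range(num_streams)]


def decode_parallel(streams, workers: int = 4) -> bytes:
    """Decode all substreams concurrently and concatenate the results.
    Each substream is self-contained, so no coder state is shared."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = list(pool.map(zlib.decompress, streams))
    return b"".join(parts)


data = b"example payload " * 1000
streams = encode_parallel_format(data, num_streams=8)
assert decode_parallel(streams) == data

# Each substream adds fixed per-stream cost (headers, flush bits),
# so the parallel format is slightly larger than one monolithic stream.
overhead = sum(len(s) for s in streams) - len(zlib.compress(data))
```

In a real codec the per-substream overhead is a few bytes of headers and coder termination, which is why the paper analyzes how the bit-rate penalty grows with the number of parallel streams.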

Paper Details

Date Published: 22 September 2015
PDF: 9 pages
Proc. SPIE 9599, Applications of Digital Image Processing XXXVIII, 95991K (22 September 2015); doi: 10.1117/12.2191624
Author Affiliations:
Amir Said, Qualcomm Technologies, Inc. (United States)
Abo-Talib Mahfoodh, Michigan State Univ. (United States)
Sehoon Yea, LG Electronics Inc. (Korea, Republic of)

Published in SPIE Proceedings Vol. 9599:
Applications of Digital Image Processing XXXVIII
Andrew G. Tescher, Editor(s)

© SPIE.