
Proceedings Paper

GPUs for data parallel spectral image compression
Author(s): Jarno Mielikainen; Risto Honkanen; Pekka Toivanen; Bormin Huang

Paper Abstract

The amount of data generated by hyper- and ultraspectral imagers is so large that considerable savings in data storage and transmission bandwidth can be achieved using data compression. Because of this large data volume, compression time is also important. The increasing programmability of commodity Graphics Processing Units (GPUs) allows them to be used for General Purpose computation on Graphics Processing Units (GPGPU). GPUs offer the potential for a considerable increase in computation speed in data-parallel applications. Data-parallel computation on image data executes the same program on many image pixels in parallel. We have implemented a spectral image data compression method called Linear Prediction with Constant Coefficients (LP-CC) using Nvidia's CUDA parallel computing architecture. CUDA is a parallel programming architecture designed for data-parallel computation. CUDA hides the GPU hardware from developers and does not require programmers to explicitly manage threads, which simplifies the programming model. Our GPU implementation is experimentally compared to a native CPU implementation; the speed-up factor was over 30 compared to a single-threaded CPU version.
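As a rough illustration of the data-parallel pattern described in the abstract (not the paper's actual LP-CC implementation), the following CUDA sketch launches one thread per pixel to compute a prediction residual for the current spectral band from the previous band using a single constant coefficient. The kernel name, predictor form, coefficient value, and image size are assumptions made for this example only.

#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical illustration of a data-parallel, per-pixel linear prediction
// step: each thread predicts one pixel of the current band from the previous
// band using an assumed constant coefficient and stores the residual.
__global__ void lpccResidualKernel(const float* prevBand,
                                   const float* currBand,
                                   float* residual,
                                   float coeff,
                                   int numPixels)
{
    int idx = blockIdx.x * blockDim.x + threadIdx.x;
    if (idx < numPixels) {
        // The same program is executed for every pixel in parallel.
        residual[idx] = currBand[idx] - coeff * prevBand[idx];
    }
}

int main()
{
    const int numPixels = 512 * 512;          // one band of a 512x512 image (assumed size)
    const size_t bytes = numPixels * sizeof(float);

    float *dPrev, *dCurr, *dRes;
    cudaMalloc(&dPrev, bytes);
    cudaMalloc(&dCurr, bytes);
    cudaMalloc(&dRes, bytes);
    // In a real program the band data would be copied in with cudaMemcpy here.

    const int threadsPerBlock = 256;
    const int blocks = (numPixels + threadsPerBlock - 1) / threadsPerBlock;
    lpccResidualKernel<<<blocks, threadsPerBlock>>>(dPrev, dCurr, dRes,
                                                    0.97f, numPixels);
    cudaDeviceSynchronize();

    cudaFree(dPrev);
    cudaFree(dCurr);
    cudaFree(dRes);
    return 0;
}

CUDA then maps these thread blocks onto the GPU's multiprocessors without the programmer managing individual threads, which is the simplification the abstract refers to.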

Paper Details

Date Published: 31 August 2009
PDF: 8 pages
Proc. SPIE 7455, Satellite Data Compression, Communication, and Processing V, 74550C (31 August 2009); doi: 10.1117/12.828135
Author Affiliations:
Jarno Mielikainen, Univ. of Kuopio (Finland)
Risto Honkanen, Univ. of Kuopio (Finland)
Pekka Toivanen, Univ. of Kuopio (Finland)
Bormin Huang, Univ. of Wisconsin-Madison (United States)


Published in SPIE Proceedings Vol. 7455:
Satellite Data Compression, Communication, and Processing V
Bormin Huang; Antonio J. Plaza; Raffaele Vitulli, Editor(s)
