
Proceedings Paper

Parallel methods for similar image compression and classification with common models
Author(s): Oleg S. Pianykh; John M. Tyler; Raj Sharman

Paper Abstract

This paper addresses efficient parallel compression and classification for sets of similar images, such as those generated by satellite imagery, medical imaging (CT and MR scans), or aerial surveillance. Our experiments show that the similarities within each class of images can be expressed more efficiently in the domain of image-compressing transforms. In particular, the paper shows that a single predictive compression model can be constructed for an entire class of similar images of the same nature and then used for nearly optimal compression of any image in that class. Extracting the optimal class-compression model remains a computationally intensive process, but it can be accelerated considerably on parallel computers. The paper demonstrates how such a class-wide compression model can be extracted in parallel, and how it can then be used for parallel compression of a database of similar images and for classification of new images into the appropriate similarity classes. The results of this parallel similar-image analysis are demonstrated on MR and CT brain images obtained from the M.D. Anderson Cancer Center.
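The abstract's central idea is that one predictive model, fitted once for a whole class of similar images, can decorrelate any image in that class; the low-entropy residuals then compress well. The paper's actual model and parallelization are not reproduced here. As a minimal illustrative sketch (our assumption, not the authors' method), the following fits a shared least-squares predictor from each pixel's left and top neighbors across several training images, then computes prediction residuals for a new image of the same class:

```python
import numpy as np

def neighbor_features(img):
    # Predict each interior pixel img[i, j] from its causal neighbors:
    # left = img[i, j-1] and top = img[i-1, j], plus a bias term.
    img = img.astype(np.float64)
    left = img[1:, :-1].ravel()
    top = img[:-1, 1:].ravel()
    target = img[1:, 1:].ravel()
    X = np.stack([left, top, np.ones_like(left)], axis=1)
    return X, target

def fit_class_model(images):
    # One least-squares predictor shared by every image in the class.
    Xs, ys = zip(*(neighbor_features(im) for im in images))
    X, y = np.vstack(Xs), np.concatenate(ys)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef  # three coefficients: left weight, top weight, bias

def residuals(img, coef):
    # Residuals of a new image under the shared class model; for images
    # similar to the training class these have much lower variance than
    # the raw pixels, so an entropy coder would compress them better.
    X, y = neighbor_features(img)
    return y - X @ coef
```

Because every image reuses the same `coef`, fitting (one least-squares solve over pooled neighbor samples) is paid once per class, and per-image compression reduces to an embarrassingly parallel residual computation, which is the structure the abstract exploits on parallel computers.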

Paper Details

Date Published: 21 September 1998
PDF: 7 pages
Proc. SPIE 3452, Parallel and Distributed Methods for Image Processing II, (21 September 1998); doi: 10.1117/12.323464
Author Affiliations:
Oleg S. Pianykh, Louisiana State Univ. (United States)
John M. Tyler, Louisiana State Univ. (United States)
Raj Sharman, Louisiana State Univ. (United States)

Published in SPIE Proceedings Vol. 3452:
Parallel and Distributed Methods for Image Processing II
Hongchi Shi; Patrick C. Coffield, Editor(s)

© SPIE.