
Journal of Applied Remote Sensing

Intensity–hue–saturation-based image fusion using iterative linear regression
Author(s): Mufit Cetin; Abdulkadir Tepecik

Paper Abstract

The image fusion process produces a high-resolution image by combining the superior features of a low spatial-resolution multispectral image and a high-resolution panchromatic image. Despite its widespread use, owing to its fast computation and strong sharpening ability, the intensity–hue–saturation (IHS) fusion method can introduce color distortions, especially when large gray-value differences exist among the images to be combined. This paper proposes a spatially adaptive IHS (SA-IHS) technique that avoids these distortions by automatically adjusting the amount of spatial information injected into the multispectral image during the fusion process. The SA-IHS method suppresses the contribution of pixels that cause spectral distortion by assigning them weaker weights, thereby avoiding excessive redundancy in the fused image. The experimental database consists of IKONOS images, and the experimental results, both visually and statistically, demonstrate the improvement of the proposed algorithm over several other IHS-like methods, such as IHS, generalized IHS, fast IHS, and generalized adaptive IHS.
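
As context for the additive IHS formulation discussed in the abstract, the NumPy sketch below illustrates the general idea: the panchromatic detail (Pan minus the intensity component) is injected into the multispectral bands, and pixels with large gray-value differences receive weaker weights. The ihs_fuse function and its specific weighting scheme are illustrative assumptions only, not the paper's actual SA-IHS algorithm.

    import numpy as np

    def ihs_fuse(ms, pan, adaptive=True, eps=1e-6):
        """Generic IHS-style pansharpening sketch.

        ms  : float array of shape (H, W, 3), upsampled multispectral bands
        pan : float array of shape (H, W), panchromatic band
        When adaptive is True, the injected detail is down-weighted where
        the pan/intensity difference is large (an illustrative stand-in for
        the paper's spatially adaptive weighting, whose exact form is not
        reproduced here).
        """
        intensity = ms.mean(axis=2)      # I component of the IHS model
        detail = pan - intensity         # spatial detail to inject

        if adaptive:
            # Weight in (0, 1]: pixels with large gray-value differences
            # receive weaker weights, suppressing spectral distortion.
            diff = np.abs(detail)
            weight = 1.0 / (1.0 + diff / (diff.mean() + eps))
        else:
            weight = 1.0                 # classic (fast) IHS injection

        fused = ms + (weight * detail)[..., None]
        return np.clip(fused, 0.0, None)

With adaptive=False the function reduces to the standard additive (fast) IHS injection; the adaptive branch only illustrates the kind of per-pixel suppression the abstract describes.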

Paper Details

Date Published: 22 November 2016
PDF: 15 pages
J. Appl. Remote Sens. 10(4), 045019 (2016). doi: 10.1117/1.JRS.10.045019
Published in: Journal of Applied Remote Sensing Volume 10, Issue 4
Author Affiliations:
Mufit Cetin, Yalova Univ. (Turkey)
Abdulkadir Tepecik, Yalova Univ. (Turkey)


© SPIE