
SPIE Press Book

Multispectral Image Fusion and Colorization

Book Description

This book provides a complete overview of the state of the art in color image fusion, the associated evaluation methods, and its range of applications, including a comprehensive treatment of fusion metrics and a comparison of objective metrics with subjective evaluations. Part I addresses the historical background and basic concepts. Part II describes image fusion theory. Part III focuses on quantitative and qualitative evaluation. Part IV presents several fusion applications, including two primary multiscale fusion approaches, the image pyramid and the wavelet transform, as they pertain to face matching, biomedical imaging, and night vision.

Book Details

Date Published: 27 March 2018
Pages: 396
ISBN: 9781510619067
Volume: PM285

Table of Contents

1 Motivation
1.1 What Is Image Fusion?
1.2 The Purpose of Image Fusion
1.3 How Image Fusion Works
1.4 Applications
      1.4.1 Face recognition
      1.4.2 Biomedical applications
      1.4.3 Visual inspection
      1.4.4 Multifocus and multiexposure fusion
1.5 Summary
References

PART I: IMAGE FUSION CONCEPTS

2 Introduction
2.1 Image Fusion Survey
      2.1.1 Image fusion categorization
      2.1.2 Multimodal image fusion
2.2 Image Fusion Algorithms
      2.2.1 Multiresolution-analysis-based approach
      2.2.2 Learning-based approach
      2.2.3 Fusion in color space
      2.2.4 Other approaches
      2.2.5 Feature-level fusion
           2.2.5.1 Image segmentations, contours, and silhouettes
           2.2.5.2 Image amplitude, phase, and eigenfeatures
           2.2.5.3 Image statistical features
      2.2.6 Decision-level fusion
2.3 Datasets for Image Fusion
2.4 Summary
References

3 Biological Vision
3.1 Animal Vision with Image Fusion Systems
      3.1.1 Snakes: electro-optical/infrared fusion
      3.1.2 Mantis shrimp: polarization analysis
      3.1.3 Butterflies: ultraviolet vision
3.2 Human Visual System
      3.2.1 Eyes and retina
      3.2.2 Biological image formation
      3.2.3 Binocular vision
      3.2.4 Visual cortex
3.3 Visual Perception
      3.3.1 Gestalt theory
      3.3.2 Visual illusions
      3.3.3 Neural networks
3.4 Application Examples
      3.4.1 Opponent color process for multisensor fusion
      3.4.2 Color-image-enhancement model
           3.4.2.1 Simulation of bipolar and horizontal cells
           3.4.2.2 Simulation of ganglion and amacrine cells
           3.4.2.3 Experiments for model validation
      3.4.3 Pulse-coupled neural network
3.5 Summary
References

4 Operating Conditions
4.1 Layered Sensing
4.2 Introduction to Operating Conditions
      4.2.1 Sensor-based-classifier operating conditions
      4.2.2 Scenario-based evaluation
      4.2.3 Design of experiments for scenarios
4.3 Operating-Condition Modeling Terminology
      4.3.1 Direct versus indirect OCs
      4.3.2 Derived OCs
      4.3.3 Standard versus extended OCs
4.4 Operating-Condition Model Design
      4.4.1 Bayes model
      4.4.2 Bayes model for real-world (scenario) analysis
4.5 Operating-Condition Example
      4.5.1 Target OCs
      4.5.2 Environmental OCs
      4.5.3 Sensor OCs
      4.5.4 OC model
      4.5.5 ATC training OCs
4.6 Case Study 1: Conditioning Based on OCs
4.7 Case Study 2: Multimodal Tracking
4.8 Case Study 3: Image Fusion Tracking over OCs
4.9 Discussion
      4.9.1 Advantages
      4.9.2 Limitations
      4.9.3 Image fusion tracking
4.10 Conclusions
References

PART II: IMAGE FUSION THEORY

5 Image Analysis
5.1 Preprocessing
      5.1.1 Image acquisition
      5.1.2 Image denoising and enhancement
      5.1.3 Image normalization
5.2 Registration
      5.2.1 Reference marks or geometry
      5.2.2 Binary mask
      5.2.3 Phase correlation
      5.2.4 Mutual information
      5.2.5 SIFT or SURF
           5.2.5.1 Scale-space extrema detection
           5.2.5.2 SIFT feature representation
           5.2.5.3 SIFT feature for matching and recognition
           5.2.5.4 SIFT feature for registration
           5.2.5.5 SURF feature
      5.2.6 Using contourlets or bandelets
5.3 Segmentation
      5.3.1 Mammogram segmentation using a circular Gaussian filter
      5.3.2 Multispectral segmentation using clustering and merging
5.4 Feature Extraction
5.5 Classification
      5.5.1 Pattern classification
      5.5.2 Decision making
5.6 Examples
      5.6.1 Multispectral image registration for face recognition
      5.6.2 Medical image preprocessing for cancer detection
      5.6.3 Multispectral image segmentation for colorization
      5.6.4 Facial features
      5.6.5 Comparison of score fusion and decision fusion
5.7 Summary
References

6 Information Fusion Levels
6.1 Architectures
      6.1.1 Multilevel fusion
      6.1.2 Multiresolution fusion
6.2 Pixel (Signal) Level
6.3 Feature Level
6.4 Score Level
      6.4.1 Score normalization and cross-validation
      6.4.2 Binomial logistic regression
      6.4.3 Hidden Markov model for score fusion
6.5 Decision Level
6.6 Examples
      6.6.1 Image fusion
      6.6.2 Image fusion and feature fusion for face recognition
      6.6.3 Score fusion and decision fusion for face recognition
           6.6.3.1 Face dataset and experimental design
           6.6.3.2 Performance of single face recognition algorithm
           6.6.3.3 Performance with score fusion and decision fusion
6.7 Confusion-Matrix Decision-Level Fusion
6.8 Summary
References

7 Image Fusion Methods
7.1 Pyramids
      7.1.1 Laplacian pyramid
      7.1.2 Ratio and contrast pyramid
      7.1.3 Gradient pyramid
      7.1.4 Morphological pyramid
      7.1.5 Symbol illustration
7.2 Wavelets
      7.2.1 Advanced discrete wavelet transform
      7.2.2 Iterative advanced discrete wavelet transform
      7.2.3 Orientation-based fusion
7.3 Image Fusion Applications
      7.3.1 Fusion of color images
      7.3.2 Fusion of multiple images
7.4 Bandelet-Based Fusion
      7.4.1 Introduction to bandelets
      7.4.2 Ridgelet and bandelet methods
           7.4.2.1 Multiresolution analysis
           7.4.2.2 Fourier transform
           7.4.2.3 Wavelet transform
           7.4.2.4 Ridgelet transform
           7.4.2.5 Bandelets
7.5 Contourlet-Based Fusion
      7.5.1 Introduction to contourlets
      7.5.2 Contourlet methods
           7.5.2.1 Curvelet transform
           7.5.2.2 Contourlet transform
      7.5.3 Contourlet applications
7.6 Examples
      7.6.1 Face matching
      7.6.2 Biomedical examples
      7.6.3 Night vision
      7.6.4 Multifocus and multiexposure images
      7.6.5 Bandelet-based-fusion examples
           7.6.5.1 Image fusion evaluation
           7.6.5.2 Image registration and fusion process using bandelets
           7.6.5.3 Bandelet experiment
           7.6.5.4 Bandelet discussion
      7.6.6 Contourlet-based-fusion example
           7.6.6.1 Performance metrics
           7.6.6.2 Multimodal average fusion
           7.6.6.3 Multifocus average fusion
           7.6.6.4 Discussion
7.7 Summary
References

8 Colorization Methods
8.1 Introduction
8.2 Preprocessing and the Color-Space Transform
      8.2.1 Multispectral image preprocessing
      8.2.2 Color-space transform
8.3 Segmentation-Based Colorization Method
      8.3.1 Image segmentation
           8.3.1.1 Nonlinear diffusion
           8.3.1.2 Clustering and region merging
      8.3.2 Segment recognition
      8.3.3 Color mapping and contrast smoothing
      8.3.4 Experimental design
8.4 Channel-Based Color-Fusion Methods
      8.4.1 Color fusion of (II LWIR)
      8.4.2 Color fusion of (NIR LWIR)
8.5 Color-Mapping Colorization Methods
      8.5.1 Statistic matching
      8.5.2 Histogram matching
      8.5.3 Joint histogram matching
      8.5.4 Lookup table
8.6 Examples
      8.6.1 Segmentation-based colorization examples
      8.6.2 Channel-based and color-mapping colorization examples
8.7 Summary
References

PART III: IMAGE FUSION EVALUATION

9 Image Fusion Metrics
9.1 Introduction
9.2 Information-Theory-Based Metrics
      9.2.1 Entropy
      9.2.2 Tsallis entropy
      9.2.3 Nonlinear correlation information entropy
      9.2.4 Normalized mutual information
9.3 Structural-Similarity-Based Metrics
      9.3.1 Image-quality index
      9.3.2 Cvejic's metric
      9.3.3 Yang's metric
9.4 Image-Feature-Based Metrics
      9.4.1 Spatial frequency
      9.4.2 Multiscale-scheme-based metric
9.5 Human-Perception-Inspired Metrics
      9.5.1 Chen-Varshney
      9.5.2 Chen-Blum
9.6 Quantitative Colorization Metrics
      9.6.1 Four component metrics for the OEI
           9.6.1.1 Phase congruency
           9.6.1.2 Gradient magnitude
           9.6.1.3 Image contrast
           9.6.1.4 Color naturalness
      9.6.2 Objective evaluation index
9.7 Examples
      9.7.1 Using IQ metrics for grayscale image evaluation
      9.7.2 Using OEI for colorized-image evaluation
      9.7.3 Comparative study of fusion metrics
9.8 Summary
References

10 Image Fusion Evaluation
10.1 Combining Approach, Methods, and Metrics
10.2 Qualitative versus Quantitative Evaluation
10.3 Performance-Improvement Measurement
10.4 Condition-Based Evaluation
10.5 Experimental Design and Result Analysis
10.6 Examples
      10.6.1 Qualitative evaluation of grayscale image fusion
           10.6.1.1 Psychophysical experiment design
           10.6.1.2 Evaluation results and discussion
      10.6.2 Qualitative evaluation of night-vision colorization
           10.6.2.1 Experimental design
           10.6.2.2 Evaluation results and analysis
           10.6.2.3 User instruction for subjective evaluations of night-vision colorization
10.7 Statistical Comparison of Image Fusion Algorithms
10.8 Summary
References

PART IV: IMAGE FUSION APPLICATIONS

11 Fusion Applications in Biometrics
11.1 Multispectral Face-Recognition Example
      11.1.1 Literature review
      11.1.2 Overview of the proposed face-recognition system
      11.1.3 Face-recognition algorithms
           11.1.3.1 Circular Gaussian filter
           11.1.3.2 Face pattern byte
      11.1.4 Stereo fusion and multispectral fusion on four levels
           11.1.4.1 Image fusion
           11.1.4.2 Feature fusion
           11.1.4.3 Score fusion
           11.1.4.4 Decision fusion
      11.1.5 Experimental results and discussion
           11.1.5.1 Face dataset and experimental design
           11.1.5.2 Performance evaluation
           11.1.5.3 Single matcher
           11.1.5.4 Four-level fusion
           11.1.5.5 Summary of stereo fusion, multispectral fusion, and discussion
      11.1.6 Summary
11.2 Biometric-Score-Fusion Example
      11.2.1 Literature review
      11.2.2 Score normalization and fusion evaluation
           11.2.2.1 Score normalization
           11.2.2.2 Fusion evaluation
      11.2.3 Score-fusion processes
           11.2.3.1 Arithmetic fusion and classifier fusion
           11.2.3.2 Hidden Markov model for multimodal score fusion
      11.2.4 Experimental results and discussions
           11.2.4.1 Multimodal scores and experimental design
           11.2.4.2 Results and discussion
      11.2.5 Summary
References

12 Additional Fusion Applications
12.1 Iterative-Wavelet-Fusion Example
      12.1.1 Need for image fusion
      12.1.2 Image-quality metrics
      12.1.3 Image fusion methods
      12.1.4 Experimental results and discussion
           12.1.4.1 Experimental results
           12.1.4.2 Discussion
      12.1.5 Summary
12.2 Medical Image Fusion Example
      12.2.1 Significance of the problem
      12.2.2 Image fusion metrics and methods
           12.2.2.1 Image-quality metrics
           12.2.2.2 Iterative fusion methods
      12.2.3 Experimental results and discussion
           12.2.3.1 Experimental design
           12.2.3.2 Results and discussion
      12.2.4 Summary
12.3 Terahertz and Visual Image Fusion
      12.3.1 Properties of terahertz images, and literature review
      12.3.2 Terahertz imaging
           12.3.2.1 Use of terahertz imagery
           12.3.2.2 Terahertz images
      12.3.3 Terahertz challenge problem
           12.3.3.1 NAECON 2011 terahertz challenge problem
           12.3.3.2 Target orientation for a specular reflection
      12.3.4 Image fusion
      12.3.5 Experiment
           12.3.5.1 Edge detection and image fusion with the Canny operator
           12.3.5.2 Edge detection and image fusion with the LoG operator
           12.3.5.3 Edge detection and image fusion with intensity
      12.3.6 Summary
References

13 Summary
13.1 Fusion Methods
      13.1.1 Grayscale image fusion
      13.1.2 Multispectral image colorization
13.2 Fusion Metrics
13.3 Fusion Examples
13.4 Concluding Remarks and Future Trends
References

Appendix: Online Resources

Preface

Over the past two decades, image fusion solutions have grown steadily in number; however, no comprehensive book has been available from which to teach standard image fusion methods. Very few books follow a textbook style that elaborates the entire process, from concepts and theory to evaluation and application. Such a textbook is especially useful for training beginners.

This book was written to provide readers with an understanding of image fusion techniques through basic principles, common examples, and known methods. Common examples are presented to interest any reader in the fundamentals. Although not all methods are extensively covered, the book aims to provide students, practitioners, and researchers with a background in proven techniques. Undergraduate training in engineering or science is recommended to appreciate concepts such as linear algebra and image processing.

The second motivation for the text was to organize the terminology, results, and techniques. The book and the associated software provide readers the opportunity to explore common image fusion methods, such as how to combine multiband images to enhance computer vision and human vision for applications such as face recognition and scene understanding.

The third motivation was to provide a baseline for the performance evaluation of image fusion methods. Most publications concentrate on image fusion methods, although some quality metrics are used for comparison. Very few publications provide a comprehensive overview of fusion metrics and a comparison of objective metrics with subjective evaluations. Throughout this book, examples are shown and an array of metrics is presented that help establish the capabilities of image fusion. Different applications might use some, all, or none of the metrics, but the goal of the book is to begin formalizing image fusion evaluation.

This book presents concepts, methods, evaluations, and applications of multispectral image fusion and night-vision colorization organized into four areas: (1) concepts, (2) theory, (3) evaluation, and (4) applications. Two primary multiscale fusion approaches, the image pyramid and the wavelet transform, are elaborated as applied to several examples, including face matching, biomedical imaging, and night vision. Using these examples, multiple-level fusion is demonstrated for pixel-, feature-, score-, and decision-level fusion. Image fusion comparisons are highlighted, including data, metrics, and analytics. Finally, the book addresses a topic not highlighted elsewhere: techniques for evaluation, either objectively with computer metrics or subjectively by human users. An appendix lists online resources, including example data and code.

Chapter 1 describes the motivation for performing image fusion. An overview of fusion advantages is presented to endorse the practical uses of image fusion.

Part I includes three chapters to present the background information and basic concepts of image fusion. Chapter 2 briefly surveys the field of image fusion. Chapter 3 discusses image fusion as it exists in biological vision, whereas Chapter 4 addresses certain sensor, object, and environmental operating conditions.

Part II describes image fusion theory in four chapters. Chapter 5 reviews image analysis techniques that form a processing pipeline for pattern-recognition tasks such as image fusion. Chapter 6 covers the information fusion approaches at different levels. Commonly used image fusion methods are described in Chapter 7. Chapter 8 is dedicated to a night-vision colorization technique that uses multispectral images.

Part III consists of two chapters dedicated to quantitative and qualitative evaluation. Many publications invent new image fusion methods; however, image fusion evaluation is needed to determine the comparative advantages of new methods. Quantitative metrics are described in Chapter 9 to objectively evaluate the quality of fused images, both grayscale and colorized. Qualitative evaluation methods are discussed in Chapter 10 for conducting subjective assessments of fused imagery, which is crucial for military, medical, and color-imaging applications.

Part IV presents several fusion applications illustrated with multifocus, medical, terahertz, night-vision, and facial images. Chapter 11 concentrates on biometric applications, where a face-recognition example provides a complete illustration of multiple-level fusion. Chapter 12 includes an iterative wavelet-fusion example, a biomedical application of fusing magnetic resonance imaging scans, and terahertz and visual image fusion for concealed-object detection. Chapter 13 presents brief summaries of fusion methods, metrics, and examples.

The idea of writing a book was inspired by a SPIE short course, "Multispectral Image Fusion and Night Vision Colorization," taught by Zheng and Blasch at the SPIE Defense and Commercial Sensing conference since 2014. Course attendees encouraged the writing of a textbook; furthermore, there was interest in a summary of night-vision-colorization techniques due to the growing needs of commercial operations. The comparison and evaluation of these techniques are unique features of this book.

This is the first comprehensive text for teaching image fusion, and we hope others improve on the techniques to make image fusion methods more common.

Yufeng Zheng
Erik Blasch
Zheng Liu
March 2018


© SPIE