
SPIE Press Book

A System Engineering Approach to Imaging
Author(s): Norman S. Kopeika

Book Description

This textbook addresses imaging from the system engineering point of view, examining the advantages and disadvantages of imaging in various spectral regions. It focuses on imaging principles and system concepts rather than devices, and is intended as a senior-year undergraduate or graduate-level engineering textbook. A solution manual is included.

Book Details

Date Published: 9 March 1998
Pages: 704
ISBN: 9780819423771
Volume: PM38

Table of Contents
Introduction
Part 1 Geometrical Optics 1.1
Chapter 1 Electromagnetic Waves and Rays 1.2
1.1 The Nature of Light 1.2
1.2 Maxwell's Equations 1.4
1.3 The Wave Equation 1.5
1.3.1 Plane Wave Propagation 1.6
1.3.2 Propagation Channel 1.10
1.3.3 Polarization 1.17
1.3.4 Spherical Wave Propagation 1.19
1.4 Interfaces 1.20
1.4.1 Angles of Reflection and Refraction 1.24
1.4.2 Reflection and Transmission Coefficients (Fields) 1.26
1.4.3 Reflection and Transmission Coefficients (Power) 1.27
1.4.4 Brewster and Critical Angles 1.32
1.5 Interfaces in Series 1.35
1.6 The Eikonal Equation 1.42
1.7 The Ray Equation 1.47
References 1.52
Exercises 1.53
Chapter 2 Imaging 2.1
2.1 Fermat's Principle 2.1
2.1.1 Application to Imaging: Thin Lens Formula and Magnification 2.1
2.2 Paraxial Rays 2.7
2.3 The Thin Lens 2.11
2.3.1 The Shape of a Thin Lens Surface for Perfect Focusing 2.12
2.3.2 Phase Implications 2.15
2.3.3 Spherical Surfaces 2.17
2.4 Description of a Thin Lens 2.19
2.5 Physical Explanation 2.28
2.6 Image Formation by a Thin Lens via Ray Tracing 2.31
2.6.1 The Thin Lens Formula 2.33
2.6.2 Lateral Magnification 2.31
2.6.3 Longitudinal or Axial Magnification 2.35
2.7 Two-Lens Combination 2.37
2.8 Thick Lens 2.41
2.9 Virtual Images and Angular Magnification 2.44
2.9.1 Angular Magnification 2.46
2.10 Selfoc Lenses 2.49
2.11 Mirrors 2.51
2.12 Transmitting and Receiving Optics 2.52
2.13 Matrix Optics 2.54
2.14 Aberrations 2.62
References 2.63
Exercises 2.64
Part 2 Image Brightness and Signal-to-Noise Ratio
Chapter 3 Radiometry and Photometry 3.1
3.1 Fundamental Radiometric Quantities 3.1
3.1.1 Geometrical Relations 3.2
3.1.2 Spectral Radiometric Quantities 3.6
3.2 The Fundamental Radiometric Equation 3.8
3.2.1 Specular and Diffuse Surfaces 3.9
3.2.2 Radiant Power from a Lambertian Source 3.11
3.3 The Radiance Theorem 3.17
3.4 Irradiance of Image 3.19
3.5 Radiometry in an Attenuating Medium 3.24
3.6 Reflection Coefficients 3.26
3.7 Fundamental Photometric Quantities 3.27
References 3.31
Exercises 3.32
Chapter 4 Sources of Optical Radiation 4.1
4.1 Blackbody Radiation 4.1
4.2 Planck's Law 4.5
4.3 Natural Thermal Sources 4.12
4.4 Artificial Sources 4.19
4.5 The Laser 4.19
4.5.1 Laser Spectra 4.23
4.5.2 Resonator Stability 4.27
4.5.3 Gaussian Irradiance 4.29
4.6 Coherence Length 4.35
References 4.39
Exercises 4.40
Chapter 5 Noise 5.1
5.1 Quantum Noise 5.1
5.1.1 Generation-Recombination Noise 5.7
5.1.2 Photon-Limited Noise 5.8
5.2 Johnson Noise 5.11
5.3 1/f Noise 5.17
5.4 Electronic Signal-to-Noise Ratio: Detection as Imaging 5.17
5.4.1 Detection 5.17
5.4.2 Imaging 5.18
References 5.19
Exercises 5.19
Chapter 6 Detector Concepts and Fundamentals 6.1
6.1 Detection Schemes 6.1
6.1.1 Direct Detection 6.1
6.1.2 Heterodyne Detection 6.7
6.1.3 Advantages of Heterodyne Detection 6.11
6.1.4 Disadvantages of Heterodyne Detection 6.11
6.1.5 Antenna Properties of a Coherent Receiver 6.15
6.1.6 Effects of Integration on Signal-to-Noise Ratio 6.17
6.2 Detectors 6.18
6.2.1 Quantum Detectors 6.22
6.2.1.1 Electron Tubes 6.22
6.2.1.2 Semiconductor Detectors 6.29
6.2.2 Thermal Detectors 6.40
References 6.42
Exercises 6.45
Part 3 The Optical Transfer Function 7.i
Chapter 7 Diffraction 7.1
7.1 The Fundamental Equation of Scalar Diffraction 7.1
7.2 Fraunhofer and Fresnel Diffraction 7.12
7.3 Diffraction Through a Lens 7.18
7.4 Fourier Transforms in Polar Coordinates 7.25
7.5 Array Theorem 7.32
7.6 Optical Data Processing 7.38
7.7 Vander Lugt Filter 7.45
7.8 Holography 7.52
7.9 Coherence 7.56
7.9.1 The Van Cittert-Zernike Theorem 7.62
7.9.2 Coherence Diameter 7.64
7.9.3 Measurement of Diameters of Extended Sources 7.67
References 7.70
Exercises 7.71
Chapter 8 Diffraction-Limited Imaging 8.1
8.1 Aperture Size: Impulse Response 8.1
8.2 Optical Transfer Function for Incoherent Imaging 8.7
8.3 Optical Transfer Function for Diffraction-Limited Imaging 8.8
8.4 Slit Aperture 8.11
8.5 Circular Aperture 8.13
8.6 Coherence and Linearity 8.17
8.6.1 Incoherent Illumination 8.19
8.6.2 Coherent Illumination 8.20
8.7 Resolution of Two Points - Coherent vs Incoherent Imaging 8.20
8.8 Image Quality in the Spatial Frequency Domain 8.27
8.9 System Properties of Spread Functions and OTFs 8.32
8.10 Cascade Properties of Optical Transfer Functions 8.33
References 8.33
Exercises 8.34
Chapter 9 Modulation Contrast Function 9.1
9.1 Introduction 9.1
9.2 Modulation Contrast 9.2
9.3 Modulation Contrast Function - Sine Wave Response 9.5
9.4 Modulation Contrast Function - Square Wave Response 9.7
9.5 Contrast Transfer 9.10
9.6 Object Plane Modulation Contrast 9.11
9.7 Other Methods to Measure OTF and MTF 9.18
9.7.1 Point Spread Function - Dependence on Pixel Size 9.18
9.7.2 Edge Response 9.20
References 9.21
Exercises 9.22
Chapter 10 Contrast-Limited Resolution and Target Acquisition 10.1
10.1 Spatial Frequency Bandwidth 10.1
10.2 Limiting Resolution and Required Threshold Contrast 10.1
10.3 MTFA and NE 10.5
10.4 The Johnson Chart 10.6
10.5 Target Acquisition Probabilities 10.11
10.6 Navy Model 10.16
10.7 Textual Resolution 10.17
10.8 Examples of Contrast-Limited Resolution 10.17
10.9 Path Luminance 10.22
References 10.26
Exercises 10.27
Chapter 11 Noise-Limited Imaging and Target Acquisition 11.1
11.1 Current Signal-to-Noise Ratio Deriving from Received Irradiance 11.2
11.2 Thermal Imaging Systems 11.9
References 11.15
Exercises 11.16
Chapter 12 Human Visual System MTF and Threshold Contrast 12.1
12.1 Physiology of the Eye 12.2
12.1.2 Detection of Quanta in Photoreceptors 12.6
12.1.3 Dark Adaptation 12.10
12.2 The Spectral Response of the Human Visual System 12.11
12.3 Brightness Constancy 12.12
12.4 Visual Acuity 12.13
12.5 MTF of the Human Visual System 12.16
12.6 Threshold Contrast Curve 12.20
12.7 Moving Images 12.23
12.8 Nearsightedness and Farsightedness 12.24
References 12.25
Exercises 12.26
Chapter 13 Imaging Devices 13.1
13.1 Introduction 13.1
13.2 Semiconductor Image Sensors 13.4
13.2.1 Noise 13.6
13.2.2 Dynamic Range and Gray Scale 13.6
13.2.3 MTF 13.7
13.2.4 CCDs for the Ultraviolet 13.10
13.2.5 Thermal Imaging Focal Plane Arrays 13.10
References 13.11
Exercises 13.13
Part 4 Optical Transfer Functions for Image Motion and Vibration
Chapter 14 Optical Transfer Functions for Image Motion and Vibration
14.1 Introduction 14.1
14.2 General Method of OTF Calculation 14.3
14.3 Linear Motion OTF 14.4
14.4 Sinusoidal Motion OTF 14.10
14.4.1 High Frequency Vibrations 14.11
14.4.2 Low Frequency Vibrations 14.14
14.5 Quadratic Motion 14.23
14.5.1 Analytic Approach 14.23
14.5.2 Numerical Calculation 14.26
14.5.2.1 Constant Blur Radius 14.29
14.5.2.2 Constant Time Exposure 14.30
14.6 Target Acquisition in the Presence of Vibrations 14.31
14.7 Conclusions 14.34
References 14.34
Exercises 14.36
Part 5 Imaging Through the Atmosphere
Chapter 15 Optical Properties of the Atmosphere
15.1 Introduction 15.1
15.2 Absorption 15.3
15.3 Scattering 15.5
15.4 Turbulence 15.12
15.4.1 Fluctuations in Refractive Index - Power Spectral Density 15.13
15.4.2 Fluctuations in Refractive Index - Structure Function 15.16
15.4.3 Path-Integrated Measurements of Cn2 15.19
15.4.4 Environmental Effects on Cn2 15.24
15.4.5 Prediction of Cn2 15.24
15.5 Conclusions 15.27
References 15.29
Chapter 16 Turbulence MTF
16.1 Long and Short Exposures 16.1
16.2 Effect of Focal Length 16.3
16.3 Wavelength Dependence 16.9
16.4 Vertical or Slant Path Viewing 16.10
16.5 Imaging Horizontally 16.13
16.6 Effective Aperture Size 16.13
16.7 Techniques to Correct for Turbulence 16.18
References 16.19
Exercises 16.20
Chapter 17 Aerosol MTF
17.1 Introduction 17.1
17.2 Practical Instrumentation-Based Aerosol MTF 17.6
17.3 Exposure Time 17.12
17.4 Effective Aperture Size 17.14
17.5 Horizontal and Vertical Imaging 17.19
17.6 Wavelength Dependence 17.19
17.7 Imaging Through the Atmosphere - Comparison of Turbulence to Aerosol MTFs 17.21
17.8 Techniques for Correction of Aerosol MTF Blur 17.25
17.8.1 Active Imaging 17.25
References 17.26
Exercises 17.29
Part 6 Contrast-Limited Imaging
Chapter 18 Image Restoration
18.1 Introduction 18.1
18.2 Inverse Filters 18.1
18.3 Least Squares or Wiener Filters 18.3
18.4 Constrained Least Squares (CLS) Filters 18.5
18.5 Improved Wiener Filter for Atmospheric Degradation 18.9
18.5.1 Fractal Model 18.11
18.5.2 Scene Representation by a Wide Sense Markovian Model 18.11
18.5.3 Improved Atmospheric Wiener Filter 18.14
18.6 Maximum Entropy Restoration 18.20
18.7 Modified Backus-Gilbert Filter 18.20
18.8 Conclusions 18.24
References 18.28
Chapter 19 Effects of Atmospheric Blur and Image Restoration on Target Acquisition
19.1 Introduction 19.1
19.2 Contrast-Limited Target Acquisition 19.2
19.2.1 Angular Spatial Frequency Approach 19.2
19.2.3 Angular Magnification Model 19.3
19.2.4 Atmospheric MTF 19.5
19.2.6 Angular Magnification Results 19.10
19.2.7 Image Restoration 19.15
19.2.8 Conclusions 19.19
19.3 Atmospheric Effects on Noise-Limited Target Acquisition 19.20
19.3.1 Examples 19.23
19.3.2 Conclusions 19.28
References 19.29
Appendix
Calculating Practical Aerosol MTF
Solution Manual
Chapter 1
Chapter 2
Chapter 3
Chapter 4
Chapter 5
Chapter 6
Chapter 7
Chapter 8
Chapter 9
Chapter 10
Chapter 11
Chapter 12
Chapter 13
Chapter 14
Chapter 16
Chapter 17

INTRODUCTION

The science of imaging by human beings begins with the human visual system and thus dates back to the earliest existence of mankind. The human visual system is limited to the "visible" spectrum, which extends from violet wavelengths on the order of 400 nm (750 THz) to red wavelengths on the order of 700 nm (429 THz). Over the last century or so, however, mankind has succeeded in extending imaging capability beyond the visible spectrum towards both shorter wavelengths (ultraviolet and x-ray images, for example) and longer wavelengths (infrared, millimeter-wave, and microwave images, for example). A plot of the electromagnetic (EM) spectrum is shown in Fig. I-1. The visible spectrum is defined by the spectral response limits of the human visual system, which peaks at green wavelengths - usually the color most common in areas of human habitation. The visible is also characterized by very little atmospheric absorption of radiation, albeit with considerable scattering of light in the atmosphere, which is what makes the sky bright. Another characteristic of the visible is the transparency of most types of glass, which makes glass suitable for fashioning lenses and other optical components intended to transmit light.
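
The wavelength-frequency correspondence quoted above follows directly from f = c/λ. A quick sketch (the rounded value of c is mine) reproduces the quoted limits of the visible spectrum:

```python
C = 3.0e8  # speed of light in vacuum, m/s (rounded)

def wavelength_to_thz(wavelength_m):
    """Frequency in THz for a free-space wavelength in metres (f = c / lambda)."""
    return C / wavelength_m / 1e12

print(wavelength_to_thz(400e-9))  # violet limit, ~750 THz
print(wavelength_to_thz(700e-9))  # red limit,   ~429 THz
```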

If one branches out from the visible, at the shorter-wavelength end is the ultraviolet (UV). The near UV extends from the visible down to about 200 nm wavelength. Below that is the "far" UV, also known as the "vacuum" UV because the extremely strong atmospheric absorption of such radiation permits its transmission essentially only under vacuum conditions. Most of this absorption is by oxygen and by ozone high in the atmosphere. This absorption is essential for human existence down below, for vacuum UV and shorter-wavelength radiation is ionizing; i.e., the photon energies are so high that they cause permanent changes in media upon which they are incident by ionizing them. Hence the danger implied in the recent discovery of the ozone hole over Antarctica. In the near UV, only special UV-transmitting glasses can be used in optical components; most ordinary types of glass become opaque at wavelengths on the order of 350 nm or below.

If one proceeds from the visible towards the infrared (IR), nature behaves almost as in the visible, except that the human visual system no longer responds. Glasses that are transparent in the visible are also transparent in the near IR out to close to 3 µm wavelength. Therefore, lenses and other optical components useful in the visible are also generally quite applicable in the near IR. From about 3 µm to about 14 µm wavelength is the middle, or thermal, infrared. This wavelength region is most appropriate for thermal imaging. Special materials that are transparent to such radiation are required for lenses and optical components.

Wavelengths from about 15 µm to about 100 µm comprise the far IR. Here there is essentially negligible atmospheric transmission because of strong atmospheric absorption, much of it caused by water vapor. The spectral region from about 100 µm wavelength to 1 mm is known as the near-millimeter-wave, or submillimeter-wave, region; the strong atmospheric absorption of the far IR extends out to about 1 mm wavelength. Millimeter waves (from about 1 mm to 10 mm wavelength) form a bridge between the optical and microwave regions, and techniques and concepts from both are readily employed for millimeter waves. Atmospheric transmission at these and longer wavelengths is generally much better than at optical wavelengths.

In general, the longer the wavelength, the poorer the resolution or detail in image quality. This limitation stems from diffraction effects discussed in Part 3. It can be compensated for only by increasing the antenna aperture in proportion to the wavelength. This is extremely difficult to do at wavelengths as long as microwaves, because optical-type resolution would require apertures on the order of kilometers. Therefore, with the exception of synthetic apertures, imaging is seldom employed at such long wavelengths because of the exceedingly poor image quality; radars generally just present blips rather than image shapes. However, microwave images of near-optical quality can be obtained by employing aperture-to-wavelength ratios similar to those used at optical wavelengths. This has been shown experimentally at the University of Pennsylvania using arrays of smaller antennas that are equivalent to kilometer-order apertures.
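
The aperture-wavelength tradeoff can be made concrete with the Rayleigh criterion, θ ≈ 1.22 λ/D, for a circular aperture. In the following sketch the 10-cm visible aperture and 3-cm microwave wavelength are assumed values, chosen only to show how a kilometer-order microwave aperture arises:

```python
def rayleigh_resolution(wavelength_m, aperture_m):
    """Rayleigh-criterion angular resolution (rad) of a circular aperture."""
    return 1.22 * wavelength_m / aperture_m

def aperture_for_resolution(wavelength_m, angle_rad):
    """Aperture diameter (m) needed to reach a given angular resolution."""
    return 1.22 * wavelength_m / angle_rad

theta_vis = rayleigh_resolution(500e-9, 0.10)           # 10-cm optic, green light
d_microwave = aperture_for_resolution(0.03, theta_vis)  # 3-cm wavelength
print(theta_vis)     # ~6.1e-6 rad
print(d_microwave)   # ~6000 m: a kilometer-order aperture
```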

I.1. System Point-of-View

This textbook is about imaging, particularly from the system-engineering point of view. The approach is relatively broad, being applicable all across the spectrum, with the advantages and disadvantages of imaging in various spectral regions considered. The system engineering approach is important in system design and system analysis, both to determine final image quality and to design towards a given required image quality. This requires quantitative criteria with which to define image quality. It also requires analytical tools with which to determine the weak link in the system that plays the chief role in limiting image quality. If either mechanical vibrations or atmospheric effects are the weak link, for example, it makes no sense in noise-limited imaging to use a high-resolution imager capable of providing detail 10 times smaller than that permitted by the vibrational or atmospheric blurring, because the high-resolution imager will only permit the blur itself to be viewed clearly without providing the small detail required in the final image. In such cases a much less expensive low-resolution sensor will provide essentially the same final image quality as the high-resolution sensor. For contrast-limited imaging, however, it is possible to restore the image and remove essentially all the corrupting blur if the image degradation can be characterized correctly. In such cases, higher-resolution sensors can be advantageous. Thus, good engineering design must also relate to system analysis and to cost-effectiveness.

An engineering tool very appropriate for system design and analysis is the concept of transfer function, which relates the output of an imaging system to its input. Network transfer functions are an essential tool in electrical engineering; similarly, optical transfer functions are an essential tool in imaging systems. Much of this book, therefore, is devoted to the concept and applications of optical transfer functions, including their relationship to target acquisition probabilities. In many imaging systems the weakest link is vibrational or atmospheric blurring. Consequently, Parts 4 and 5 present specific optical transfer functions with which to describe such phenomena, and these are then applied in Part 6 to image restoration and system analysis. First, however, the concepts basic to geometrical optics, transfer functions, devices, and imaging in general are presented in tutorial fashion in the earlier chapters.
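
What makes the transfer function so convenient for system analysis is the cascade property treated in Chapter 8: component MTFs simply multiply, which immediately exposes the weak link. A small numerical sketch, in which the Gaussian component forms and widths are assumed purely for illustration:

```python
import numpy as np

# Spatial frequencies (e.g., cycles/mrad) at which to evaluate the MTFs.
f = np.linspace(0.0, 10.0, 101)

# Illustrative component MTFs; the Gaussian shapes and widths are assumed,
# not taken from the text.
mtf_optics = np.exp(-(f / 6.0) ** 2)       # diffraction/aberration blur
mtf_atmosphere = np.exp(-(f / 4.0) ** 2)   # atmospheric blur
mtf_vibration = np.exp(-(f / 8.0) ** 2)    # image-motion blur

# Cascade property: the system MTF is the product of the component MTFs,
# so the narrowest component (here the atmosphere) dominates - it is the
# "weak link" that limits resolution.
mtf_system = mtf_optics * mtf_atmosphere * mtf_vibration
```

Because the product can never exceed any single factor, improving a component other than the weak link barely changes the system MTF, which is the quantitative form of the cost-effectiveness argument above.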

I.2. Typical Imaging System

A block diagram for a typical imaging system is presented in Fig. I-2.

An important parameter to consider in the object plane scene is the contrast between object and background. This varies, often very strongly, with wavelength; in the open atmosphere it varies also with weather. The propagation channel - atmosphere, space, water, or optical fiber, for example - affects not only the transmission of the object plane scene radiation to the receiver, but may also introduce distortion, image blur, noise, and background radiation that vary with wavelength and magnification. At the receiver end, optical filters can be selected to pass those wavelengths at which contrast and signal-to-noise ratio are best. The optics collect the received radiation and form the image, which is projected onto the imager situated in the image plane. The imager may be a film camera, a TV system, a thermal imaging system, a holographic system, etc. The image may be processed for image restoration to improve resolution, target acquisition times and probabilities, etc., and the image display should be appropriate to the observer's requirements. For imaging systems involving image motion, because the receiver or the object or both are moving, image motion compensation is usually essential to provide a relatively still picture. When mechanical vibrations are involved, such as in robotics and in moving reconnaissance systems, a stabilizer is almost always necessary to limit image blur. Knowledge of the transfer function characterizing image blur can be used in restoration to remove the effects of the blur.
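
As a preview of the restoration step, the following one-dimensional sketch applies a Wiener filter of the kind developed in Chapter 18 to a blurred, noisy edge; the Gaussian blur MTF, noise level, and noise-to-signal constant are all assumed values for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# A 1-D "scene": a step edge, degraded by an assumed Gaussian MTF plus noise.
n = 256
scene = np.zeros(n)
scene[n // 2:] = 1.0

f = np.fft.fftfreq(n)            # spatial frequency, cycles/sample
H = np.exp(-(f / 0.1) ** 2)      # blur MTF (assumed known)
blurred = np.fft.ifft(np.fft.fft(scene) * H).real
noisy = blurred + 0.001 * rng.standard_normal(n)

# Wiener filter: W = H* / (|H|^2 + NSR), where NSR is the noise-to-signal
# power ratio, here a single assumed tuning constant. The NSR term keeps
# the filter from dividing by near-zero values of H at high frequencies.
nsr = 1e-4
W = np.conj(H) / (np.abs(H) ** 2 + nsr)
restored = np.fft.ifft(np.fft.fft(noisy) * W).real
```

When the blur MTF is known - measured or modeled, as for the motion and atmospheric MTFs of Parts 4 and 5 - the same per-spatial-frequency construction carries over to two-dimensional images.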

I.3. Organization of Book

Chapter 1 lays the theoretical background with which to consider image propagation through the propagation channel, as well as the conceptual foundation for geometrical optics. Chapter 2 uses geometrical optics to explain image formation.

Part 2 presents the tools needed to predict image brightness and image signal-to-noise ratio, which are applied later in Part 3 in system analysis for image contrast, target acquisition statistics, and observer performance.

Part 3 deals with Fraunhofer diffraction and the optical Fourier transform. These form the basis for discussing diffraction-limited imaging and the mathematical concept of the optical transfer function (OTF), including the modulation transfer function (MTF). The physical meaning of MTF is made clear from the definition of the modulation contrast function (MCF). Sine-wave and square-wave responses are compared, as are incoherent and coherent imaging. The use of transfer function concepts in system design, particularly for incoherent radiation, is explained in Chapter 10 for contrast-limited imaging and in Chapter 11 for noise-limited imaging. Resolution is developed quantitatively in the form of target acquisition modeling. Imaging devices and the human visual system, particularly from the standpoint of MTF and required threshold contrast, are summarized at the end of Part 3. This is followed by discussions of MTFs for image motion and vibration and for the atmosphere in Parts 4 and 5, respectively. Part 6 considers image restoration based on the MTF limiting image resolution. Examples of such deblurring are presented for images degraded by the atmosphere and by image motion. System design examples there, involving target acquisition probabilities, times, and ranges, integrate many of the basic concepts and principles found throughout this book. This is performed both prior to and following image restoration, so that the improvement brought about by the restoration is quantified.

Numerous worked-out examples are presented in almost every chapter to illustrate the mathematical concepts and to give them physical meaning. Exercises are presented at the ends of chapters to supplement the text. They form an important and integral part of the teaching text, and for this reason a fully worked-out solution manual is provided.

The text concentrates on imaging principles rather than devices, with emphasis on system concepts. It is intended as a senior-year undergraduate or graduate-level engineering text.

