
SPIE Press Book

High Dynamic Range Imaging: Sensors and Architectures, Second Edition
Author(s): Arnaud Darmont

Book Description

Illumination is a crucial element in many applications, matching the luminance of the scene to the operational range of a camera. When luminance cannot be adequately controlled, a high dynamic range (HDR) imaging system may be necessary. Such systems are increasingly used in automotive on-board systems, road traffic monitoring, and other industrial, security, and military applications. This book provides readers with an intermediate-level discussion of HDR image sensors and techniques for industrial and non-industrial applications. It describes the various sensor and pixel architectures capable of achieving HDR imaging, as well as software approaches that create high dynamic range images from lower-dynamic-range sensors or image sets. Methods for automatic control of the exposure and dynamic range of image sensors are also introduced. This edition adds CMOS pixel and image sensor design concepts and circuits.
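As a minimal illustration of the dynamic-range figure of merit the book develops in Chapter 2 (the ratio between the highest detectable signal and the noise floor), the sketch below computes it for a simple linear sensor model. The function name and the example sensor values are illustrative, not taken from the book.

```python
import math

def dynamic_range(full_well_e, noise_floor_e):
    """Return a linear sensor's dynamic range in dB and in photographic stops.

    full_well_e   -- saturation (full-well) capacity, in electrons
    noise_floor_e -- temporal dark (read) noise, in electrons RMS
    """
    ratio = full_well_e / noise_floor_e
    db = 20.0 * math.log10(ratio)   # engineering convention: 20*log10
    stops = math.log2(ratio)        # photographic convention: powers of two
    return db, stops

# Illustrative values for a typical machine-vision CMOS sensor
db, stops = dynamic_range(full_well_e=10_000, noise_floor_e=3.0)
print(f"{db:.1f} dB, {stops:.1f} stops")   # ~70.5 dB, ~11.7 stops
```

An HDR scene can easily span 120 dB or more, which is why the pixel architectures of Chapter 3 and the software methods of Chapter 4 are needed to extend this figure.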

Book Details

Date Published: 10 April 2019
Pages: 184
ISBN: 9781510622784
Volume: PM298

Table of Contents

1 Introduction
1.1 Applications Requiring a Higher Dynamic Range
1.2 High Dynamic Range Photography
1.3 Scientific Applications
1.4 High Dynamic Range, Wide Dynamic Range, and Extended Dynamic Range
1.5 Reducing the Exposure Time
1.6 HDR Applications That Do Not Require HDR Images
1.7 Image Histograms
1.8 Outline and Goals
1.9 Defining a Camera

2 Dynamic Range
2.1 Image Sensor Theory
     2.1.1 Light source, scene, pixel, and irradiance
     2.1.2 Sensing node and light-matter interaction
     2.1.3 Pixel
     2.1.4 Pixel array
     2.1.5 Readout circuits
     2.1.6 Image encoding
2.2 Low-Light Imaging Limitations
     2.2.1 Noise sources summary
     2.2.2 Lowest detectable limit
2.3 Bright-Light Imaging Limitations
     2.3.1 Saturation
     2.3.2 Highest detectable level
2.4 Signal-to-Noise Ratio
2.5 Dynamic Range Gaps
     2.5.1 Response curve
     2.5.2 Dynamic range gaps
     2.5.3 Presence function of dynamic range gaps
2.6 Dynamic Range
     2.6.1 Definition
     2.6.2 Remark
     2.6.3 Relative measurement
2.7 Image Information
2.8 Image Information of a Real Scene
2.9 Human Vision System and Its Dynamic Range
     2.9.1 General properties of human vision
     2.9.2 Dynamic range of the human eye
     2.9.3 Noise perception

3 Hardware Methods to Extend the Dynamic Range
3.1 Introduction: Integrating Linear Pixels
     3.1.1 Rolling-shutter pixel architecture
     3.1.2 Global-shutter-pixel architecture
     3.1.3 SNR and dynamic range study
3.2 Multilinear Pixels
     3.2.1 Principle
     3.2.2 How can multiple segments be realized practically?
     3.2.3 Equations of the 3T pixel: reset and multiple-segment HDR exposure
     3.2.4 Equations of the 3T pixel: readout
     3.2.5 Equations of the 3T pixel: power supply rejection ratio
     3.2.6 Charge injection and clock feedthrough
     3.2.7 Multiple-segment method based on well sizing
     3.2.8 Dynamic compression
     3.2.9 SNR and dynamic range study
3.3 Multiple Sampling
3.4 Multiple-Sensing Nodes
3.5 Logarithmic Pixels
3.6 Logarithmic Photovoltaic Pixel
3.7 Time to Saturation
3.8 Gradient-Based Image
3.9 Light to Frequency
3.10 Multiple Readout Gains
3.11 Other Methods
3.12 Multiple-Exposure Windows
3.13 Combined Methods within One Sensor
3.14 Summary
3.15 Companding ADCs
3.16 Extended-Dynamic-Range Color Imaging
3.17 LED Flicker Mitigation
3.18 Sensors Used in Applications
3.19 3D Stacking
3.20 Packaging Issues

4 Software Methods to Extend the Dynamic Range
4.1 General Structure of a Software Approach
4.2 High Dynamic Range Image Data Merging
     4.2.1 Ideal case
     4.2.2 Real sensors and cameras
     4.2.3 Debevec's algorithm
     4.2.4 Alternate method: Mann and Picard
     4.2.5 Alternate method: Mitsunaga and Nayar
     4.2.6 Alternate method: Robertson et al.
4.3 Noise Removal
     4.3.1 Temporal pixel noise
     4.3.2 Ghosts and misalignments
4.4 Tone Mapping
4.5 Software Methods Applicable to Certain Image Processing Applications
4.6 Sensors with Integrated Processing
4.7 Simulated High Dynamic Range Images

5 Optical Limitations
5.1 Lens Glare
5.2 Modulation Transfer Function
5.3 Conclusions

6 Automatic High Dynamic Range Control
6.1 Automatic Exposure of Linear Sensors
     6.1.1 Principle
     6.1.2 Brightness calculation
     6.1.3 Filtering and stability for machine vision
     6.1.4 Filtering and stability for display
     6.1.5 Guard-band-based filtering
6.2 Automatic Exposure of High Dynamic Range Sensors
6.3 Offset Control

7 High Dynamic Range File Formats
7.1 Color Space
     7.1.1 Introduction
     7.1.2 Color space definition
7.2 Storing Image Data of Extended Dynamic Range Cameras
7.3 Storing Data of Radiance Maps and High Dynamic Range Software: Direct Pixel Encoding Methods
     7.3.1 IEEE single precision floating point
     7.3.2 Pixar™ log encoding
     7.3.3 Radiance RGBE
     7.3.4 SGI™ LogLuv TIFF
     7.3.5 Industrial Light and Magic™ OpenEXR
     7.3.6 Unified Color™ BEF
     7.3.7 Microsoft/HP™ scRGB
     7.3.8 JPEG XR
     7.3.9 Summary of file formats
     7.3.10 GenICam
7.4 Storing Data of Radiance Maps and High Dynamic Range Software: Gradient-Based and Flow-Based Methods

8 Testing High Dynamic Range Sensors, Cameras, and Systems
8.1 Testing of Software-Based Systems
8.2 Testing of Non-High Dynamic Range (Linear) Sensors and Cameras
     8.2.1 The ISO approach
     8.2.2 The EMVA1288 approach
8.3 Testing of High Dynamic Range Sensors and High Dynamic Range Sensor-Based Cameras
     8.3.1 The ISO approach
     8.3.2 The EMVA1288 approach
     8.3.3 Two-projector approach
     8.3.4 Projector-and-display approach
8.4 Contrast Detection Probability

9 Dynamic Range in Non-Visible and 3D Imaging Devices
9.1 Infrared Imaging
9.2 3D Imaging

10 Conclusions
10.1 Important Figures of Merit of a High Dynamic Range Sensor
10.2 Questions


This book collects the knowledge about image sensors, dynamic range, high dynamic range (HDR) image sensors, and HDR applications gained from over 16 years in the image sensor and imaging business as an engineer, project manager, researcher, instructor, business development manager, and consultant. With my first employer, Melexis, I worked on one of the first HDR global shutter CMOS image sensors and its related reliability and production testing. At that time, very few publications were available on in-pixel HDR imaging and its issues and difficulties, even though several companies (such as FillFactory, IMEC, PhotonFocus, Awaiba, Melexis, and Micron) already had device prototypes and were building the knowledge internally. No company had experience with the full production testing of such devices, and everything had to be made to meet automotive standards.

At Aphesa, several of our custom camera and consulting projects were related to or involved HDR. We had several pipe inspection projects, including oil and gas applications, as well as medical endoscopy applications, in which the lighting was very difficult to control and the irradiance of the scene was very uneven. The scenes also contained highly reflective and weakly reflective parts simultaneously. HDR techniques were implemented in the sensor, in the camera, or in the host software. Later, we saw more multispectral and hyperspectral projects with very large differences in signal level between bands, causing imaging difficulties similar to those we had encountered in HDR, so techniques derived from HDR were used.

When the first edition of this book was published in October 2012, it was the first comprehensive text about HDR techniques used in pixels, cameras, and software with an engineering level of technical detail. Since then, HDR photography has become more popular.

In 2015, I started teaching image sensors and imaging more widely than the SPIE HDR course that inspired this book, and attendees raised many questions. It became necessary to update the book with answers to some of these questions.

In 2017, I joined the European Machine Vision Association as part-time manager of standards, and one of the first actions taken was to extend the EMVA1288 standard to be compatible with HDR image sensors and cameras. There is also an initiative to extend the GenICam standard to offer HDR pixel formats and compatible data containers and controls. Since the first edition, things have evolved quickly in the field of HDR because of the use of HDR in consumer markets such as DSLRs and mobile devices, and also because of the development of autonomous vehicles, drones, and the latest generation of camera-based driver-assistance systems.

Therefore, the philosophy of the book has evolved. The explosion of HDR applications has led to a significant increase in the number of algorithms and publications on the topic, so this volume serves as a starting point for exploring HDR imaging by introducing the core concepts with schematics and equations and going deep into the general principles.

This book provides readers with intermediate-to-advanced knowledge of HDR image sensors and techniques for industrial and non-industrial applications. HDR is increasingly used in automotive on-board systems, autonomous vehicles, road traffic monitoring, and in industrial, security, medical, and military applications, as well as in photography. It offers advantages such as greater robustness against direct sunlight or the reflection of strong lights on metals, and better detection of objects located in shadows. The book is not about the artistic side of HDR images, although some aspects of HDR photography are mentioned and several photographs are included for illustration. Instead, it describes the various sensor and pixel architectures that achieve HDR imaging, as well as software approaches to create HDR images from lower-dynamic-range sensors or image sets. Methods for automatic control of the exposure and dynamic range of image sensors are introduced, the most important optical effects are considered, and some aspects of system testing are also covered.
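The multiple-exposure software approach mentioned above (treated rigorously in Chapter 4 through Debevec's and related algorithms) can be sketched in a few lines. This is a simplified illustration assuming an ideal linear sensor response and a hat-shaped weighting of my own choosing, not the book's algorithm: clipped or near-black pixels are down-weighted, and each remaining sample is divided by its exposure time so that all frames estimate the same scene radiance.

```python
import numpy as np

def merge_exposures(images, exposure_times, black=0.05, saturation=0.95):
    """Merge a bracketed stack of linear LDR images into one radiance map.

    images         -- list of float arrays normalized to [0, 1]
    exposure_times -- exposure time of each frame, in seconds
    """
    num = np.zeros_like(images[0])
    den = np.zeros_like(images[0])
    for img, t in zip(images, exposure_times):
        # Hat-shaped weight: trust mid-range pixels, distrust clipped ones
        w = np.clip(np.minimum(img - black, saturation - img), 0.0, None)
        num += w * img / t   # each sample rescaled to radiance units
        den += w
    return num / np.maximum(den, 1e-12)

# Tiny example: two frames; the bright pixel clips in the long exposure
short = np.array([[0.02, 0.40]])
long_ = np.array([[0.20, 0.99]])
radiance = merge_exposures([short, long_], [0.001, 0.01])
print(radiance)   # each pixel is recovered from its best-exposed frame
```

Real sensors are not perfectly linear, which is why the calibration methods of Section 4.2 recover the camera response curve before merging, and why Section 4.3 deals with noise, ghosts, and misalignment between frames.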

Arnaud Darmont
September 2018

© SPIE