
Electronic Imaging & Signal Processing

Toward realistic night-vision simulation

A novel color-transformation method converts daytime scenes into night-vision goggle images suitable for realistic nighttime training.
27 March 2009, SPIE Newsroom. DOI: 10.1117/2.1200903.1573

To date, nighttime training in flight or driving simulators has been hindered by the lack of realistic night-vision goggle (NVG) rendering. NVG performance is often simulated in training vehicles by converting daytime images into black and white and displaying the resulting scene in green. However, this approach ignores the fact that NVG images are fundamentally different from daytime scenes. For instance, chlorophyll-containing elements such as plants and trees appear bright in NVG images, whereas they are dark in black-and-white conversions (see Figure 1). For effective training, the illusions and limitations associated with NVGs must be captured and retained.1

New methods have recently become available that enable the simulation of sensor effects such as noise and halos around light sources. However, image intensifiers are particularly sensitive to near-IR (NIR) light, yet this spectral information is typically disregarded when converting daytime imagery. In addition, reduced contrast and resolution cause NVG images to appear significantly different from daytime imagery. All of these effects give rise to misinterpretations and illusions. Training people to deal with these effects therefore requires simulating the typical properties of NVG imagery.1


Figure 1. Demonstration of the differences between (right) black-and-white daytime images and (left) night-vision goggle (NVG) performance.

We recently developed a method to fuse imagery from night-vision sensors and render it in natural daytime colors (‘Color-the-Night’), making the images much easier and faster to interpret.2–4 This technique can also be applied in reverse, transforming the daytime images of a training simulator into realistic-looking nighttime NVG images (‘NVG-the-day’). Figure 2 shows daytime and NIR images of the same scene. A standard black-and-white conversion looks very different from its NIR counterpart. Our method instead derives an optimal transformation from the daytime colors to grayscale, using the NIR image as reference, yielding a simulated NVG image that closely matches the NIR input. Note that even the moss on the pavement becomes visible, as it is in the NIR image.


Figure 2. NVG-the-day color transformation. While a standard black-and-white conversion looks very different from the near-IR (NIR) image, the NVG-the-day result closely resembles the NIR image.
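
The article does not give the mathematical form of the NVG-the-day transformation, but its core idea, mapping daytime colors onto NIR-like intensities learned from a reference image, can be sketched as a simple regression. The following Python fragment is an illustrative assumption rather than the authors' implementation: it fits weights by least squares against a registered NIR reference and applies them to produce a simulated NVG grayscale image. The function names, the linear model, and the use of NumPy are all hypothetical.

    # Sketch only: a linear color-to-NIR mapping fitted by least squares.
    # The real NVG-the-day transformation is not specified in the article.
    import numpy as np

    def fit_color_scheme(rgb, nir):
        """Fit weights so that a weighted sum of R, G, B (plus an offset)
        approximates the registered NIR reference image.

        rgb: (H, W, 3) float array of daytime colors in [0, 1]
        nir: (H, W) float array, the NIR reference
        """
        n_pixels = rgb.shape[0] * rgb.shape[1]
        X = np.column_stack([rgb.reshape(-1, 3), np.ones(n_pixels)])
        w, *_ = np.linalg.lstsq(X, nir.ravel(), rcond=None)
        return w

    def apply_color_scheme(rgb, w):
        """Convert a daytime color image to a simulated NVG grayscale image."""
        n_pixels = rgb.shape[0] * rgb.shape[1]
        X = np.column_stack([rgb.reshape(-1, 3), np.ones(n_pixels)])
        return np.clip((X @ w).reshape(rgb.shape[:2]), 0.0, 1.0)

Once fitted, such a scheme can be reused: weights derived from one reference scene are applied to other daytime imagery of a similar landscape, which is exactly the kind of transfer the IKONOS experiments below examine.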

To create realistic night vision in our flight simulator, we used IKONOS-satellite images, which contain three visible bands and one NIR band. Using an IKONOS image as reference, our method derives a conversion from daytime colors to grayscale that closely matches the NIR band. We applied our method to different IKONOS images (see Figure 3) and found that the standard black-and-white conversion poorly resembles the NIR images. As expected, a color transformation derived from the same image best matched its NIR band, achieving the highest correlation between the pixel values. When a color scheme derived from a Baghdad image was applied to an image of The Hague, the prediction was quite poor. However, a color transformation derived from a dataset of a similar region yielded predictions nearly as good as those for the image from which it was derived. This suggests that different color schemes should be used for different types of landscape.


Figure 3. Application of the NVG-the-day technique to an IKONOS-satellite image (left). Shown are (center) the real NIR image and (right) the NVG-the-day prediction.
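
The comparison between color schemes reported above rests on the correlation between predicted and real NIR pixel values. A minimal sketch of such an evaluation, again assuming NumPy and hypothetical function names, could look like this:

    # Pearson correlation between a simulated NVG image and the real NIR band.
    import numpy as np

    def scheme_correlation(simulated_nvg, real_nir):
        return np.corrcoef(simulated_nvg.ravel(), real_nir.ravel())[0, 1]

    # e.g., fit a scheme on one Hague scene and score it on a Baghdad scene
    # (and vice versa) to test how well it transfers across landscape types.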

We applied color transformations derived from satellite imagery to the daytime textures of a flight-simulator database to create a nighttime environment. Alternatively, we can apply the color transformation to the simulator image ‘on the fly’ (see Figure 4), which represents chlorophyll-containing plants more realistically. We will next combine the NVG-the-day method with algorithms that model sensor effects such as halos, noise, and reduced resolution to create a realistic training experience. Validation studies must show whether this improved visualization leads to realistic illusions and therefore enables effective nighttime training.


Figure 4. Application of the NVG-the-day technique to flight-simulator imagery. A perspective-distorted NIR image with overlapping content (bottom right) confirms the validity of the transformation used.
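
The sensor effects mentioned above (halos around light sources, noise, and reduced contrast and resolution) are not specified in detail in this article, so the following sketch is only one plausible way to approximate them. All thresholds and parameters are invented, and SciPy's gaussian_filter stands in for whatever model is actually combined with NVG-the-day.

    # Rough approximation of NVG sensor effects; all parameters are invented.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def add_nvg_effects(img, halo_thresh=0.9, halo_sigma=6.0,
                        blur_sigma=1.5, noise_std=0.03, contrast=0.7):
        out = gaussian_filter(img, blur_sigma)            # reduced resolution
        bright = np.where(img > halo_thresh, img, 0.0)    # isolate light sources
        out = np.maximum(out, gaussian_filter(bright, halo_sigma))  # halos
        out = 0.5 + contrast * (out - 0.5)                # reduced contrast
        out += np.random.normal(0.0, noise_std, out.shape)  # intensifier noise
        return np.clip(out, 0.0, 1.0)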

Maarten Hogervorst
Human Factors
TNO Defense & Security
Soesterberg, Netherlands

Maarten Hogervorst received his PhD in physics from the University of Utrecht in 1996 for his work on visual perception. After a brief appointment at Oxford University, UK, he joined TNO Human Factors as a research scientist. His current research interests include visual-information processing, human search, and image-quality assessment and enhancement.