
Head-up display: not as easy as it seems!

Pilots can now see video and retrieve information from infrared sensors without looking down at instrument panels, but getting it right requires careful design.
30 September 2007, SPIE Newsroom. DOI: 10.1117/2.1200709.0859

Flight decks in growing numbers are being fitted with head-up display (HUD) devices (see Figure 1) that provide all the information needed to pilot the aircraft, overlaying graphical symbology on the real-world view. One significant new feature displays video from an infrared camera that can see through darkness and some types of fog, providing an enhanced view of the scene ahead. Video from sensors can also reveal heat from other aircraft, trucks, and even animals on the runway.

However, implementing video display on aircraft HUDs presents unique challenges on several fronts. First, the quality of the video requires improved methods of transmission and display. In addition, the video content should convey the maximum amount of information. Finally, the controls that the pilot uses to adjust the display need to be optimized.

Although HUDs have flown on fighter jets for many years, their adoption on corporate, airline, and military transport aircraft over the past decade has proved a popular safety-enhancing innovation. The introduction of affordable infrared cameras for use on HUDs has made it possible for these aircraft to safely land and take off in conditions of reduced visibility. With most pilots in agreement that an infrared camera view and/or a computer-generated artificial view, known as synthetic vision, significantly improves safety, manufacturers are rapidly installing this feature. Practical problems associated with adding video to the HUD have come to light during flight tests and regular use, leading to refinements that will make the system easier to use and more effective.

Figure 1. The HUD is a unique display that overlays flight information on the outside world view.

With continued innovation, the benefits of enhanced and synthetic vision on the HUD can help pilots in an ever-wider variety of weather conditions while reducing the extra training and experience now required for optimal use of the system.

At Rockwell Collins, we have experience with several programs that use video display on the HUD in various lighting conditions during actual flight. These real-world situations have highlighted a number of challenges.1 Technical shortcomings in the hardware of some cathode-ray-tube-based HUD installations, for instance, mean that the video does not use all the pixels available from the sensor, which can limit the amount of symbology that can be displayed. We developed a buffering method that removes this display limitation. Another issue concerns video brightness, which must be sufficient to properly show the full range of information available from the sensor. New LED light sources enable the HUD to make use of the maximum number of gray shades in the video image. In addition, the video contrast must allow the pilot to discern the flight information symbology. Contrast-control algorithms and haloing (see Figure 2) can ensure proper contrast.

Figure 2. Symbology haloing surrounds important flight information with a clear zone free of video.
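As a rough illustration of how haloing works, the sketch below blanks video pixels within a small radius of the symbology strokes. This is a minimal example assuming 2-D gray-shade frames held as NumPy arrays; the function name, the mask representation, and the halo width are illustrative assumptions, not Rockwell Collins' implementation.

```python
import numpy as np

def apply_symbology_halo(video, symbology_mask, halo_px=2):
    """Suppress video in a clear zone surrounding symbology strokes.

    video:          2-D array of gray-shade values.
    symbology_mask: boolean array, True where symbology is drawn.
    halo_px:        width of the video-free halo, in pixels.
    """
    halo = symbology_mask.copy()
    # Naive binary dilation: grow the mask by one pixel per iteration.
    for _ in range(halo_px):
        grown = halo.copy()
        grown[1:, :] |= halo[:-1, :]   # shift down
        grown[:-1, :] |= halo[1:, :]   # shift up
        grown[:, 1:] |= halo[:, :-1]   # shift right
        grown[:, :-1] |= halo[:, 1:]   # shift left
        halo = grown
    out = video.copy()
    out[halo] = 0  # blank video inside the halo so symbology stays legible
    return out
```

In practice the symbology itself is then drawn into the cleared zone, so flight-critical cues never compete with video texture directly behind them.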

Similarly, a noisy video image makes information difficult to see and may be dangerous if it obscures objects in the outside world view. Use of digital interfaces and noise reduction algorithms can help minimize video noise.
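One common family of noise-reduction algorithms is recursive temporal averaging of successive frames. The sketch below shows the idea in its simplest form; the class, its name, and the smoothing factor are assumptions for illustration, not the specific algorithm used on any certified HUD.

```python
import numpy as np

class TemporalNoiseFilter:
    """Recursive (exponentially weighted) frame averaging.

    Random pixel noise is uncorrelated frame to frame, so averaging
    suppresses it while slowly varying scene content is preserved.
    A small alpha smooths more but lags moving objects.
    """

    def __init__(self, alpha=0.25):
        self.alpha = alpha
        self.state = None  # running average of frames seen so far

    def update(self, frame):
        frame = frame.astype(float)
        if self.state is None:
            self.state = frame  # first frame initializes the filter
        else:
            self.state = self.alpha * frame + (1.0 - self.alpha) * self.state
        return self.state
```

The lag this introduces is one reason filter strength must be tuned: too much smoothing blurs exactly the moving obstacles the sensor is meant to reveal.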

Optimizing the content of the video image involves making sure it is accurately aligned with the outside-world view and that its field of view matches, with no noticeable lag in the image during normal aircraft maneuvers. Filters and enhancement algorithms can improve the information content, and fusing images from different sensors can help.
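Per-pixel weighted blending is perhaps the simplest form of sensor fusion. The sketch below assumes the two frames are already registered to each other and scaled to the same gray range; the function name and fixed weighting are illustrative, and operational fusion schemes are considerably more sophisticated (feature- or frequency-domain methods, for example).

```python
import numpy as np

def fuse_frames(ir_frame, sv_frame, ir_weight=0.6):
    """Blend a registered infrared frame with a synthetic-vision frame.

    ir_frame, sv_frame: 2-D arrays covering the same field of view.
    ir_weight:          fraction of the output taken from the IR sensor;
                        the remainder comes from the synthetic view.
    """
    ir = ir_frame.astype(float)
    sv = sv_frame.astype(float)
    return ir_weight * ir + (1.0 - ir_weight) * sv
```

Even this trivial blend makes the registration requirement obvious: if the two frames are misaligned by a few pixels, edges ghost and the fused image is worse than either input alone.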

Finally, controls must enable pilots to clear the HUD display of all imagery when necessary. Smart brightness controls can optimize luminance and contrast in various external lighting conditions. Increasingly, aircraft manufacturers recognize that benefits from an infrared sensor or computer-generated view on HUD represent an important safety measure. The complex interaction of sensor, display, and the human eye make integration difficult, however, and some early innovations have proved less than perfect. Experience gained from installing, optimizing, and certifying these systems is contributing to lessons learned that should improve performance with each new installation.
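The smart brightness controls mentioned above might, in their simplest form, map an ambient-light reading to a target stroke luminance with clamping at the display's limits. The sketch below is a minimal proportional law in which every parameter value is an assumption for illustration, not a certified HUD setting.

```python
def display_luminance(ambient_lux, min_nits=0.1, max_nits=1500.0, gain=0.015):
    """Map an ambient-light sensor reading to a HUD display luminance.

    ambient_lux: ambient illuminance from a forward-looking light sensor.
    min_nits:    floor so the display never disappears at night.
    max_nits:    ceiling at the light source's maximum output.
    gain:        proportionality between ambient light and target luminance.
    """
    target = gain * ambient_lux
    return max(min_nits, min(max_nits, target))
```

A real control law would also account for the pilot's manual setting and the background scene luminance seen through the combiner, but the clamped proportional core captures why daylight and night operation need such different outputs.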

Peter Howells
Head-up Guidance Systems
Rockwell Collins
Portland, OR

Peter Howells is a staff engineer at the Head-up Guidance Systems unit of Rockwell Collins in Portland, Oregon. He has been involved in the design and development of aircraft cockpit equipment for 30 years. He received his degree in electrical engineering from Salford University in England. Howells has published numerous papers on designing equipment that helps pilots fly aircraft.