What's the holdup with heads-up?

The technology to immerse drivers in augmented reality is—finally—almost here. Is that a good or bad thing?
01 March 2021
By Bob Whitby
A driving simulator at Virginia Tech’s COGENT Lab. Credit: Nayara de Oliveira Faria

If you missed the 2019 animated movie Spies in Disguise, here's a quick synopsis: Lance Sterling, self-described "world's greatest spy," sports a flawless tuxedo while saving the world and shape-shifting between a human and a pigeon. There are twists and turns along the way, but the plot details aren't important here. More germane to science is Lance's sinuous Audi RSQ E-Tron sports car. It's an electric-powered marvel with a 300-mile range, self-driving at the touch of a button, and fast enough to outrun a missile. All well and good, and not too deep into the realm of the fantastic, save for the missile-evading ability.

During the obligatory chase scene, however, the car's luminous, volumetric, holographic heads-up display takes center stage, as if to tease something you'd be able to buy off the showroom floor next year. And that's where the animation/reality schism widens.

In 2021, decades after heads-up displays first arrived in cars, drivers still glance down at the dashboard or look at simple windshield reflections to figure out how fast they're going and where to turn next. Augmented reality (AR) heads-up displays (HUDs), which mix information from computers and sensors into the user's field of view (as distinct from virtual reality, or VR, which is entirely computer generated), are a reality in aviation, medicine, and industry. Cars can drive themselves these days, but an AR HUD on your dashboard is still (mostly) science fiction.

Old news

The idea of putting information directly in front of the driver seems intuitive. At 55 mph a car travels the length of a football field in five seconds, about the time it takes to send or read a text. In 2018, distracted driving resulted in 2,841 deaths in the United States, according to the National Highway Traffic Safety Administration. Anything that keeps eyes up and on driving should be a good thing.
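The arithmetic behind that comparison is easy to check; the quick sketch below uses Python purely as a calculator, with the values from the claim above:

```python
# Distance covered during a five-second glance at highway speed.
mph = 55
glance_s = 5
feet_per_second = mph * 5280 / 3600    # 5,280 feet per mile
distance_ft = feet_per_second * glance_s
print(f"{distance_ft:.0f} ft")         # ~403 ft; a football field,
                                       # end zones included, is 360 ft
```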

And HUDs are not new. In aviation, they date to the end of World War II, when rudimentary systems were installed in a few military planes. A version of the British de Havilland Mosquito had a system that projected radar information on a small piece of glass mounted directly in the pilot's line of sight.

In modern military aircraft, AR gives pilots an almost omniscient view of the world. Cameras mounted on the body of the plane feed a seamless 360-degree panorama to the pilot's helmet, allowing the pilot to "see through" the structure as if it weren't there. Flight data, weapons status, and target information are incorporated into a world-relative view that, combined with night vision and forward-looking infrared sensors, minimizes weather and terrain restrictions. The pilot can lock onto a target with an eye gaze.

AR has also proven highly adaptable in other fields. Medical researchers at Imperial College London used a Microsoft HoloLens wearable AR headset to "see through" tissue and reconnect blood vessels during surgery; researchers in Finland used the HoloLens to make repairs to the International Space Station faster and more precise; German scientists created a system for manipulating a prosthetic hand through the use of AR glasses with an integrated camera for tracking.

In the automotive realm, HUDs aren't new either; the 1988 Oldsmobile Cutlass Supreme projected a digital speedometer readout onto the windshield. But futurists dream of full-windshield displays that can do everything from highlighting the shape of a pedestrian in a crosswalk on a dark, foggy night, to pointing out landmarks in a city as the car passes, to letting you know that your favorite coffee shop has a special on lattes. Like hydrogen-powered cars, this level of AR HUD always seems just a few years off.

A holographic waveguide combiner for HUD. The original image is injected into the waveguide on the lower right using a holographic optical element coated on the waveguide. After propagation inside the waveguide, the image is extracted by another hologram on the upper left and projected toward the viewer. Credit: Pierre-Alexandre Blanche

Optimizing the optics

In a 2014 paper, researchers from Virginia Tech wrote about both the promise and the reality of automotive AR HUDs. "While we are already seeing some successful video-based augmented reality auxiliary displays (e.g., center-mounted backup aid systems), the application opportunities of optical see-through AR as presented on a drivers' windshield are yet to be fully tapped; nor are the visual, perceptual, and attention challenges fully understood. As we race to field AR applications in transportation, we should first consider the perceptual and distraction issues that are known in both the AR and transportation communities, with a focus on the unique and intersecting aspects for driving applications."

The technological challenges are complex, involving not only the sensors and cameras that feed information to the AR HUD, but also the inverse relationship between the size of the display's eyebox and its field of view. The eyebox is the region of eye positions from which the driver can see the full image; when it's small, a slight shift of the head cuts off the image. Field of view refers to the angular width and height of the image. Humans have about 120 degrees of static, horizontal, binocular vision; a typical automotive HUD today covers less than 10 degrees.

Solving the problem isn't simply a matter of making the image bigger. "The problem with most of the heads-up displays and augmented reality is that eyebox and field of view are inversely proportional to one another," said Pierre-Alexandre Blanche, research professor of optical sciences at the University of Arizona. "So if you increase one the other decreases, and vice versa. People are trying to have an extremely large field of view to have a comfortable image and extremely large eyebox, and that's also a high level of difficulty to overcome some basic optical limitations."
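A back-of-the-envelope model makes the tradeoff concrete. In a conventional HUD, the product of eyebox width and field of view is roughly fixed by the system's étendue, the optical invariant Blanche alludes to. The sketch below assumes a hypothetical value for that product; the numbers are illustrative only, not those of any real system:

```python
# Illustrative only: assume the optics fix the product of eyebox
# width and field of view (a stand-in for the system's etendue).
INVARIANT_MM_DEG = 1300        # hypothetical constant for this sketch

def eyebox_width_mm(fov_deg):
    """Eyebox width available at a given field of view."""
    return INVARIANT_MM_DEG / fov_deg

for fov in (5, 10, 20, 50, 100):
    print(f"FOV {fov:3d} deg -> eyebox ~ {eyebox_width_mm(fov):4.0f} mm")
```

At a 100-degree field of view, the same optics would leave an eyebox of barely a centimeter, narrower than the spacing between a driver's pupils, which is why simply scaling up the image doesn't work.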

In headset applications such as the HoloLens, AR designers have the advantage of a combiner that's close to, and moves with, the wearer's eyes. Therefore a small eyebox can still accommodate a relatively large field of view. Automotive AR HUD designers have no such luxury.

A view as experienced by an observer looking at the HUD. The runway strip is located at infinity and the avionic symbology is displayed by the HUD and overlaps with the environment. There is no need for the viewer to refocus their gaze when looking at the information or the outside world. Credit: Pierre-Alexandre Blanche

At the most basic level, a HUD projects an image onto a semitransparent combiner, typically made of treated glass or plastic, which overlays the reflected image on the "real world" visible through it. Such a system can be as simple as placing your cell phone on the dashboard and viewing the resulting Fresnel reflection in the windshield, or as complex as holograms that diffract specific wavelengths and act as lenses to increase the field of view. In any case, the image size and field of view are constrained by the physical size of the relay optics in the system.
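For the cell-phone-on-the-dashboard case, the Fresnel equations predict how much light bare glass actually reflects. A minimal sketch, assuming an uncoated pane with a generic refractive index of 1.5 and unpolarized light; the angles and index are illustrative, not measured windshield values:

```python
import math

# Rough sketch: Fresnel reflectance of an uncoated glass surface
# (index ~1.5) for unpolarized light. This is the dim "ghost" image
# you see when a phone screen reflects off the inside of a windshield.

def fresnel_reflectance(theta_i_deg, n1=1.0, n2=1.5):
    ti = math.radians(theta_i_deg)
    tt = math.asin(n1 * math.sin(ti) / n2)          # Snell's law
    rs = ((n1 * math.cos(ti) - n2 * math.cos(tt)) /
          (n1 * math.cos(ti) + n2 * math.cos(tt))) ** 2
    rp = ((n1 * math.cos(tt) - n2 * math.cos(ti)) /
          (n1 * math.cos(tt) + n2 * math.cos(ti))) ** 2
    return (rs + rp) / 2                            # average polarizations

for angle in (0, 30, 60):
    print(f"{angle:2d} deg incidence -> ~{fresnel_reflectance(angle):.1%} reflected")
```

Even at a steep windshield rake, less than a tenth of the light comes back, which is why such a makeshift display looks faint in daylight.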

Waveguide technology is a promising solution to the eyebox/field of view issue. It's a complex name for a relatively simple idea, Blanche said. "Waveguide heads-up displays are based on the fact that you can inject an image inside a two-dimensional guide, which is a very pedantic way to say a piece of glass."

In a waveguide, an image is introduced into the glass through a holographic optical element. It's confined by total internal reflection, thanks to the index difference between the glass and the surrounding air, until it is extracted and directed toward the viewer at several points by a second holographic optical element. In an experimental design, Blanche was able to increase the longitudinal magnification of the original image by incorporating optical power into the injection hologram, and boost eyebox size by increasing the size of the extraction hologram.
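The confinement Blanche describes is total internal reflection: light injected into the glass at a steep enough angle cannot escape until the extraction hologram redirects it. A minimal sketch of the geometry, using a typical glass index of 1.5 rather than the parameters of any published design:

```python
import math

# Minimal sketch of the confinement condition in a waveguide HUD.
# Light injected at an angle steeper than the critical angle is
# trapped by total internal reflection until a second hologram
# extracts it. Index values are typical, not from a specific design.

n_glass, n_air = 1.5, 1.0
theta_c = math.degrees(math.asin(n_air / n_glass))
print(f"critical angle ~ {theta_c:.1f} deg")          # ~41.8 deg

# A trapped ray advances a fixed lateral distance per bounce;
# for a 2-mm-thick guide and a 55-degree bounce angle:
thickness_mm = 2.0
step_mm = 2 * thickness_mm * math.tan(math.radians(55))
print(f"lateral step per bounce ~ {step_mm:.1f} mm")  # ~5.7 mm
```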

Ultimately, the only limitation is the size of the waveguide itself, said Blanche. "The goal is to have a heads-up display that the driver can see all across the windshield, so an extremely large field of view, like 100 degrees, and eventually have the passenger able to see different information on the entire field of view on the windshield, which is technically possible," he said, adding, "Well, I said 'technically,' but I should have said 'theoretically.'"

The future is inching closer. DigiLens, a California company specializing in waveguide AR HUDs, announced in September 2020 that its CrystalClear unit was market-ready. The unit produces a 15-degree by 5-degree field of view from equipment packaged in only five liters of space, about a third the volume of current AR HUD offerings. DigiLens said the system produces 12,000 nits of luminance, several times brighter than a high-end TV, with information displayed at infinity so drivers don't have to refocus from real-world objects to read what's on the screen. The extraction hologram corrects for windshield curvature, so no additional optics are required.

Also in September, Mercedes-Benz became the first car maker to offer an AR HUD. The system, available as an option in its S-Class sedans, features arrows that point the way around corners, lines that highlight the edge of a road, and a bar that flashes underneath cars ahead to warn that you're getting close.

The AR HUD in the Mercedes-Benz S-Class sedans. Credit: youtu.be/rDcZG06e6dk

The human factor

The question of "can we?" will soon be answered in the affirmative. But what about the corresponding "should we?" Until cars drive themselves everywhere, all the time, humans need to pay attention to the road. Will AR HUDs help or hinder that goal?

Nayara de Oliveira Faria, a researcher who studies attention, perception, and cognition in augmented and virtual reality at Virginia Tech's COGENT Lab, notes that the standards needed to answer that question don't yet exist.

"What I'm trying to do is develop new methods to evaluate the effects of augmented reality on distraction when you're driving," Faria said. "The standards we have for any type of in-vehicle displays—GPS or cell phone or anything including augmented reality head-up displays—the most common standard in the United States is the NHTSA guidelines for visual distraction, which in simple terms will tell you two seconds is the maximum that a driver can look away."

Drivers don't look away from HUDs. That's the safety selling point. Information appears in the field of view, ideally on the same focal plane as objects in the real world to eliminate the need for accommodation. But looking isn't the same as seeing, and seeing isn't the same as processing.
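That focal-plane point is easy to quantify: the eye's accommodation demand is the reciprocal of viewing distance in meters, so an image pushed optically far away asks almost nothing of the eye, while a dashboard-distance display forces a refocus on every glance. The distances below are representative, not those of any particular HUD:

```python
# Accommodation demand in diopters is 1 / distance in meters.
# Representative distances: dashboard display, typical HUD virtual
# image, and an image projected (optically) to near infinity.
for d_m in (0.75, 2.5, 10.0, 1000.0):
    print(f"virtual image at {d_m:6.1f} m -> {1 / d_m:.2f} D of accommodation")
```

Eliminating the refocus handles the "looking" part; as Faria's research shows, it does nothing for the "seeing."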

Faria has been investigating cognitive tunneling and inattentional blindness. The former is a phenomenon familiar to any driver who has tried to pay attention to the road while holding a hands-free cell phone conversation. The latter is looking at something but not seeing it. When a driver stares directly at an approaching motorcycle and turns into its path anyway, because nine times out of 10 he's looking for a car, that's inattentional blindness.

In one study conducted in the COGENT Lab's simulator, Faria had participants read a text on a HUD while following a car that suddenly hits the brakes. The study drivers never looked away from the road, but it didn't matter. "They crash, even though they are looking at [the car ahead]. Although it seems like you're paying attention, you're blind to the situation that just changed in your environment."

Researchers in the United Kingdom found that where information is placed on a windshield affects drivers' ability to stay in their lane. "If you put things right in the center you'll be able to stay in lane for a long period of time because you can keep the white lines in your periphery," said Gary Burnett, professor of transport human factors at the University of Nottingham. "As you get further and further out it'll get more difficult, basically forcing people to look at different places whilst trying to drive."

The real challenge, said Burnett, will be for designers not to make things overly complicated. The larger the field of view, the more room for information, and the greater the temptation to fill it. "It's a consumer product designed for engagement versus designed to minimize distractions. Because if you want to minimize distraction you put as little on it as possible."

Burnett envisions an alternate scenario in which AR HUDs never progress much beyond their present level of development, at least until cars drive themselves and humans need something else to do. It's possible that headsets, or even AR contact lenses, leapfrog full-windshield AR HUD technology and cars will integrate with AR headsets the way they do today with phones, he said.

"It may well be that full-windshield heads-up displays never happen because everyone is just wearing glasses. How your glasses communicate with the car would be a bit like the way your phone communicates with your car."

Which, if true, will be a letdown. AR HUDs have been a long time coming. And Lance Sterling doesn't wear glasses.

Bob Whitby is a freelance science writer based in Fayetteville, Arkansas.
