Augmenting Reality Poses Challenges for Optics

When it comes to VR's more advanced cousin, AR, there's still work to be done before the world sees commercially viable visors.

01 January 2019
Neil Savage

At the beginning of 2019, many people have probably already tried a virtual reality (VR) headset. They're available in stores for video gamers and have been integrated into rides at Disney World. But when it comes to VR's more advanced cousin, augmented reality (AR), there's still a lot of development needed before the world sees commercially viable visors.

According to Bernard Kress, the partner optical architect for the HoloLens team at Microsoft and SPIE Board Member, "Everybody agrees, the optics aren't ready for primetime." And yet the tech industry, from large companies like Google and Microsoft to smaller businesses like Florida-based Magic Leap, is fully invested in tackling those technical hurdles.

In VR systems, the main goals are to improve optical performance and bring the price down. For AR, the challenge is greater: the aim is to make headsets that look and feel much like sunglasses, that show you the real world while overlaying additional images or information, and that do all this without causing eye fatigue or costing too much. "When you do augmented reality you want to have a thin window in front of your eye through which you look at the real world," says SPIE Fellow Jannick Rolland, professor of optical engineering at the Institute of Optics at the University of Rochester, New York, and director of the National Science Foundation's Center for Freeform Optics. "And then that window is also a way to send the virtual image to your eye."

While virtual reality attempts to immerse the user in an artificial world, augmented reality tries to add to the real world. Possible applications include laying an MRI or x-ray image over a patient so a surgeon can see where she's operating, or projecting a wiring schematic onto a bulkhead for an engineer. Crime investigators in the Netherlands have already used AR to project images from a crime scene to experts back in a lab, who then make annotations visible via a headset worn by the onsite investigator.

In AR, the challenges include designing optics that pass most of the light from the outside world while controlling light from the display so the image doesn't suffer from chromatic aberration or stray light; making sure the virtual image appears to sit at the right distance from the eye; and providing a field of view wide enough that the image doesn't vanish when the wearer's eyes move. The optics also have to be relatively easy to manufacture at commercial scale, without driving the cost of the headset beyond what consumers will bear.

Further, the whole system has to take aesthetics into account. "No one wants to walk around with a bucket on their head the whole day," says SPIE Fellow Mark Brongersma, a professor of materials science and engineering who leads the Plasmonics and Photonics group at Stanford University. That's much less of a factor in VR, Rolland says. "In virtual reality, you are in your home or training or something, so you don't care as much," she says.

All those factors drive the specifications the optics will need to meet. Once a designer has settled on the architecture for a display system, engineers can turn to emerging optical technologies to build the necessary components. Among those new technologies are freeform optics and metasurfaces.

Going Freeform

Freeform optics is the practice of designing individual surfaces of optical components without rotational or translational symmetry, unlike the spherical or aspherical surfaces of traditional optical elements. The best-known component may be the freeform prism, commonly used to fold the path of light in head-mounted displays: light from a display is coupled into the prism through one surface, guided through the bulk by total internal reflection, perhaps in combination with a reflective coating, and then sent out through another surface of the prism. "Each of the surfaces is not a simple surface but a freeform surface to do aberration correction," says SPIE Fellow Hong Hua, a professor of optical sciences at the University of Arizona in Tucson. The shape of each surface is individually defined to perform better wavefront correction than a series of traditional lenses could provide.
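To make "no rotational or translational symmetry" concrete: one common way to describe a freeform surface is a base conic plus XY-polynomial terms, whose mixed x-y coefficients break the symmetry. A minimal sketch in Python, with purely illustrative coefficients (real surfaces come out of design optimization, not hand-picked values):

```python
import numpy as np

def freeform_sag(x, y, c=0.01, k=0.0, coeffs=None):
    """Sag z(x, y): base conic plus XY-polynomial freeform terms.

    c      -- base curvature (1/mm)
    k      -- conic constant
    coeffs -- {(i, j): c_ij} polynomial coefficients; mixed x-y terms
              have no rotational symmetry, which is what makes the
              surface 'freeform'.
    """
    r2 = x**2 + y**2
    base = c * r2 / (1.0 + np.sqrt(1.0 - (1.0 + k) * c**2 * r2))
    poly = sum(cij * x**i * y**j for (i, j), cij in (coeffs or {}).items())
    return base + poly

# Illustrative only -- not coefficients from any real design.
x = np.linspace(-5.0, 5.0, 11)
z = freeform_sag(x, 2.0, coeffs={(2, 1): 1e-4, (0, 3): -5e-5})
```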

A freeform eyepiece would be preferable to using a stack of multiple lenses to deliver an image from a microdisplay worn on a person's temple, Hua says. A stack would be too thick and block too much light from the outside world. Instead, Hua has been working on a design in which she folds optical surfaces into a waveguide, bending the optical path through multiple refractions.

Another issue optical engineers are trying to address with freeform optics is vergence-accommodation conflict. The human eye naturally adjusts its focus based on how far away it perceives an object to be. So if an AR headset's optics place the image at one fixed, close distance from the eyes while the virtual object appears to be much farther away, the eye receives conflicting cues about where to focus. The mismatch can lead to headaches and eye strain.
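The conflict is commonly quantified in diopters, the inverse of the viewing distance in meters. A toy calculation (the distances here are illustrative, not from any real headset):

```python
# Focus cues in diopters (1/meters). Illustrative numbers only.
display_focal_distance_m = 2.0    # where the optics hold the image in focus
rendered_object_distance_m = 0.5  # where stereo vergence says the object sits

accommodation_demand_d = 1.0 / display_focal_distance_m   # 0.5 D
vergence_demand_d = 1.0 / rendered_object_distance_m      # 2.0 D

mismatch_d = vergence_demand_d - accommodation_demand_d   # 1.5 D
print(f"Vergence-accommodation mismatch: {mismatch_d:.1f} D")
# Comfort studies typically cite a tolerance of only a fraction of a
# diopter for extended viewing, which is why this mismatch causes strain.
```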

Figure: Example of focus cue mismatch between physical and virtual objects in a conventional optical see-through head-mounted display.

To help with this common problem, Hua has developed a multifocal-plane design, essentially a lens with different concentric focal planes. "The idea is instead of having one focus for the system, you can basically create a stack of focus," she says. A spatial light modulator projects an image onto each focal plane in sequence while blocking out the others. If the switching happens fast enough, the eye doesn't notice it but can still focus at different distances.
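The timing requirement follows from simple arithmetic: each plane must refresh fast enough to avoid flicker, so the modulator has to run that many times faster. A sketch with an assumed plane count (illustrative, not taken from Hua's designs):

```python
# Time-multiplexed multifocal display: planes are shown in turn, so the
# modulator must switch N times faster than the per-plane refresh rate.
num_focal_planes = 6
flicker_free_rate_hz = 60   # assumed per-plane refresh to avoid flicker

modulator_rate_hz = num_focal_planes * flicker_free_rate_hz
print(f"Required plane-switching rate: {modulator_rate_hz} Hz")  # 360 Hz

# Spacing planes evenly in diopters (not meters) matches how the eye focuses.
near_d, far_d = 3.0, 0.0    # 0.33 m out to optical infinity
plane_depths_d = [far_d + i * (near_d - far_d) / (num_focal_planes - 1)
                  for i in range(num_focal_planes)]
print("Plane depths (diopters):", plane_depths_d)
```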

Going Wide

Another challenge for AR headsets is achieving a large enough field of view (FoV), which is how wide a scene the eye can see. Barmak Heshmat, a research scientist at Massachusetts Institute of Technology in Cambridge, Massachusetts, says the ultimate goal would be an FoV of 154 degrees for each eye, and Rolland says the minimum acceptable is 60 degrees. There are tradeoffs, however. For one thing, a wide FoV means more optics, adding bulk to the system. For another, a wider field of view results in a smaller eye box, the region near the eye within which the pupil can move before the projected image vanishes. "You want a wide field of view and you also want a large eye box," Rolland says, "and the challenge is to get both."
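That tension isn't just a design inconvenience; it follows from the conservation of étendue, which fixes (roughly) the product of eye-box width and the sine of the field half-angle for a given optical system. A simplified illustration with assumed numbers:

```python
import math

# Simplified etendue-style tradeoff: eye-box width x sin(half FoV) is
# roughly conserved for a fixed optical invariant. Numbers are illustrative.
invariant = 10.0 * math.sin(math.radians(30))  # 10 mm eye box at 60-deg FoV

for fov_deg in (60, 80, 100, 120):
    eyebox_mm = invariant / math.sin(math.radians(fov_deg / 2))
    print(f"FoV {fov_deg:3d} deg -> eye box ~{eyebox_mm:.1f} mm")
# Widening the field from 60 to 120 degrees shrinks the eye box by ~40%.
```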

One way some engineers are addressing the problem is by using holograms. Microsoft, for instance, argues that holographic displays solve many of the problems in AR headsets, allowing lightweight displays with less aberration and multiple focal planes in a wide field of view. Microsoft researchers described their work at the SIGGRAPH computer graphics conference in 2017.

Their design offers an 80-degree horizontal FoV. The hologram consists of a series of smaller "sub-holograms," each of which focuses light at a different depth in the scene, allowing the system to set the focus of each pixel individually. The design is flat and compact, and uses the holographic projector itself to correct aberrations.
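One way to picture a sub-hologram: each patch of the overall hologram carries the phase profile of a tiny lens focused at its pixel's rendered depth. The sketch below uses the textbook point-focus phase to illustrate the idea; it is not Microsoft's actual algorithm.

```python
import numpy as np

def point_focus_phase(x, y, z_focus_m, wavelength_m=520e-9):
    """Phase (radians) that focuses a plane wave to a point at depth z_focus.

    This is the standard spherical-wavefront phase; a sub-hologram display
    assigns a profile like this, each with its own z_focus, to each patch.
    """
    r = np.sqrt(x**2 + y**2 + z_focus_m**2)
    return (2 * np.pi / wavelength_m) * (r - z_focus_m)

# Two patches of one hologram focusing at different depths (illustrative).
xs = np.linspace(-1e-3, 1e-3, 5)               # 2 mm patch
near_patch = point_focus_phase(xs, 0.0, 0.25)  # pixel rendered at 25 cm
far_patch = point_focus_phase(xs, 0.0, 2.0)    # pixel rendered at 2 m
```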

Apple might be thinking along the same lines: in 2018 it purchased Akonia Holographics, a Colorado-based company working on holographic lenses for augmented reality headsets.

Going Meta

Another technique to meet the demands of VR and AR optics is the use of metasurfaces. Metasurfaces contain periodic structures (a series of pillars, for instance) with dimensions comparable to the wavelengths of the light they manipulate. That sets up optical resonances and gives engineers extremely fine control over light, even producing a negative index of refraction in some cases. "These structures look completely alien at first observation, but they have a much higher ability to focus light however you want," Brongersma says.

Working with researchers from Magic Leap, he's built gratings that consist of silicon beams 30 nm wide and 75 nm thick, spaced 125 nm apart. The nanobeams act as optical antennas, redirecting light where he wants it. He's also studying other high-index materials, such as titanium dioxide and gallium nitride; the refractive index determines the resonant wavelength a metasurface can achieve. The metasurfaces are coated onto a standard lens, with individual coatings for different wavelengths and polarizations. For instance, one coating could be made to pass visible light but reflect infrared. The headset could then direct IR light to the user's cornea and collect the reflections to track where the user's gaze is directed, so the system can project the image to the right place.
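A quick check shows why the nanobeams act as resonant antennas rather than an ordinary diffraction grating: the structure's period is well below visible wavelengths, so no diffracted orders can propagate in air. (The period below is an assumption, taken as the 125 nm spacing plus the 30 nm beam width.)

```python
# Is the nanobeam grating subwavelength for visible light?
beam_width_nm, spacing_nm = 30, 125
period_nm = beam_width_nm + spacing_nm   # ~155 nm (assumed definition)

for wavelength_nm in (450, 520, 630):    # blue, green, red
    # Grating equation: the m=1 order propagates only if wavelength < period.
    first_order_propagates = wavelength_nm < period_nm
    print(f"{wavelength_nm} nm: first order propagates? {first_order_propagates}")
# All False: visible light sees an effective medium, so the optical response
# is governed by the nanobeam resonances rather than by diffraction.
```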

Manufacturing metalenses will be a challenge for industry, according to Brongersma. He uses electron beam lithography to build his gratings over an area roughly 100 microns across, but that process would be far too expensive to cover the couple of square inches of lens on a pair of glasses. And the structures have to be built to an accuracy of better than 10 nm.
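The scale of that gap is easy to estimate. A minimal sketch, assuming a 100-micron-square research patch and roughly two square inches of lens:

```python
# Rough area scale-up from an e-beam research patch to an eyeglass lens.
# The patch width is as described; the lens area is an assumption.
patch_area_cm2 = (100e-4) ** 2   # 100 um x 100 um = 1e-4 cm^2
lens_area_cm2 = 2 * 2.54 ** 2    # ~2 square inches ~ 12.9 cm^2

scale_factor = lens_area_cm2 / patch_area_cm2
print(f"Area scale-up: ~{scale_factor:,.0f}x")  # ~129,000x
# Writing ~129,000 times more area serially, at sub-10 nm accuracy, is why
# a research tool doesn't translate directly to high-volume production.
```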

Figure: This silicon-based metalens for 3D lightfield imaging uses an array of lenses with displaced optical axes.

Metasurfaces could have a lot of advantages for AR and VR, says SPIE Member Arka Majumdar, professor of electrical engineering and physics at the University of Washington in Seattle. "You can make a very thin lens," he says. "You can use all the light." But the surfaces are not easy to create. "Designing these things requires a huge amount of computational power," he says. Majumdar has founded a company, Tunoptix, that aims to use metasurfaces with microelectromechanical systems for both AR and endoscopy applications.

However the design and manufacturing questions shake out, Majumdar is certain that metasurfaces will play a role. If the aim is to build a pair of high-tech glasses that look like something out of Mission: Impossible, the only way to achieve that compactness and performance is with a metalens. "I'm not saying metasurfaces will solve it," Majumdar says. "I'm just saying without metasurfaces there is no way we are going to get to that form factor."

It's unlikely all the demands of AR headsets could be met by optics alone, so many people working on the technologies are thinking about how to marry their optical design with computational tricks. The headsets are already doing a lot of computation to produce the images or track the eyes, Majumdar says, so why not use some of that power to improve what's being sent to the optical elements? Indeed, Microsoft's SIGGRAPH paper talks of a computational approach, where the hardware is simplified and much of the wavefront control is the job of the software. Rolland agrees. "Not everything has to be done optically. Some things can be done in software," she says.

There's still a lot of work to do. In his free online SPIE course, "Introduction to VR, AR, and MR," Kress points out that Apple CEO Tim Cook has acknowledged that the quality of AR displays is "not there yet." Nonetheless, Apple is investing massively in AR. "He knows that it's not for tomorrow," says Kress, "it's for the day after tomorrow."

---

Neil Savage is a science and technology writer in Lowell, Massachusetts.

 
