
Spie Press Book

Optical Architectures for Augmented-, Virtual-, and Mixed-Reality Headsets
Author(s): Bernard C. Kress

Book Description


This book is a timely review of the various optical architectures, display technologies, and building blocks for modern consumer, enterprise, and defense head-mounted displays for various applications, including smart glasses, smart eyewear, and virtual-reality, augmented-reality, and mixed-reality headsets. Special attention is paid to the facets of the human perception system and the need for a human-centric optical design process, which allows for the most comfortable headset without compromising the user's experience. Major challenges, from wearability and visual comfort to sensory and display immersion, must be overcome to meet market analyst expectations, and the book reviews the optical technologies best suited to address such challenges, as well as the latest product implementations.

Book Details

Date Published: 20 January 2020
Pages: 270
ISBN: 9781510634336
Volume: PM316

Table of Contents

1 Introduction
Word of Caution for the Rigorous Optical Engineer

2 Maturity Levels of the AR/VR/MR/Smart-Glasses Markets

3 The Emergence of MR as the Next Computing Platform
3.1 Today's Mixed-Reality Check

4 Keys to the Ultimate MR Experience
4.1 Wearable, Vestibular, Visual, and Social Comfort
4.2 Display Immersion
4.3 Presence

5 Human Factors
5.1 The Human Visual System
     5.1.1 Line of sight and optical axis
     5.1.2 Lateral and longitudinal chromatic aberrations
     5.1.3 Visual acuity
     5.1.4 Stereo acuity and stereo disparity
     5.1.5 Eye model
     5.1.6 Specifics of the human-vision FOV
5.2 Adapting Display Hardware to the Human Visual System
5.3 Perceived Angular Resolution, FOV, and Color Uniformity

6 Optical Specifications Driving AR/VR Architecture and Technology Choices
6.1 Display System
6.2 Eyebox
6.3 Eye Relief and Vertex Distance
6.4 Reconciling the Eyebox and Eye Relief
6.5 Field of View
6.6 Pupil Swim
6.7 Display Immersion
6.8 Stereo Overlap
6.9 Brightness: Luminance and Illuminance
6.10 Eye Safety Regulations
6.11 Angular Resolution
6.12 Foveated Rendering and Optical Foveation

7 Functional Optical Building Blocks of an MR Headset
7.1 Display Engine
     7.1.1 Panel display systems
     7.1.2 Increasing the angular resolution in the time domain
     7.1.3 Parasitic display effects: screen door, aliasing, motion blur, and Mura effects
     7.1.4 Scanning display systems
     7.1.5 Diffractive display systems
7.2 Display Illumination Architectures
7.3 Display Engine Optical Architectures
7.4 Combiner Optics and Exit Pupil Expansion

8 Invariants in HMD Optical Systems, and Strategies to Overcome Them
8.1 Mechanical IPD Adjustment
8.2 Pupil Expansion
8.3 Exit Pupil Replication
8.4 Gaze-Contingent Exit Pupil Steering
8.5 Exit Pupil Tiling
8.6 Gaze-Contingent Collimation Lens Movement
8.7 Exit Pupil Switching

9 Roadmap for VR Headset Optics
9.1 Hardware Architecture Migration
9.2 Display Technology Migration
9.3 Optical Technology Migration

10 Digital See-Through VR Headsets

11 Free-Space Combiners
11.1 Flat Half-Tone Combiners
11.2 Single Large Curved-Visor Combiners
11.3 Air Birdbath Combiners
11.4 Cemented Birdbath–Prism Combiners
11.5 See-Around Prism Combiners
11.6 Single Reflector Combiners for Smart Glasses
11.7 Off-Axis Multiple-Reflector Combiners
11.8 Hybrid Optical Element Combiners
11.9 Pupil Expansion Schemes in MEMS-Based Free-Space Combiners
11.10 Summary of Free-Space Combiner Architectures
11.11 Compact, Wide-FOV See-Through Shell Displays

12 Freeform TIR Prism Combiners
12.1 Single-TIR-Bounce Prism Combiners
12.2 Multiple-TIR-Bounce Prism Combiners

13 Manufacturing Techniques for Free-Space Combiner Optics
13.1 Ophthalmic Lens Manufacturing
13.2 Freeform Diamond Turning and Injection Molding
13.3 UV Casting Process
13.4 Additive Manufacturing of Optical Elements
13.5 Surface Figures for Lens Parts Used in AR Imaging

14 Waveguide Combiners
14.1 Curved Waveguide Combiners and Single Exit Pupil
14.2 Continuum from Flat to Curved Waveguides and Extractor Mirrors
14.3 One-Dimensional Eyebox Expansion
14.4 Two-Dimensional Eyebox Expansion
14.5 Display Engine Requirements for 1D or 2D EPE Waveguides
14.6 Choosing the Right Waveguide Coupler Technology
     14.6.1 Refractive/reflective coupler elements
     14.6.2 Diffractive/holographic coupler elements
     14.6.3 Achromatic coupler technologies
     14.6.4 Summary of waveguide coupler technologies

15 Design and Modeling of Optical Waveguide Combiners
15.1 Waveguide Coupler Design, Optimization, and Modeling
     15.1.1 Coupler/light interaction model
     15.1.2 Increasing FOV by using the illumination spectrum
     15.1.3 Increasing FOV by optimizing grating coupler parameters
     15.1.4 Using dynamic couplers to increase waveguide combiner functionality
15.2 High-Level Waveguide-Combiner Design
     15.2.1 Choosing the waveguide coupler layout architecture
     15.2.2 Building a uniform eyebox
     15.2.3 Spectral spread compensation in diffractive waveguide combiners
     15.2.4 Field spread in waveguide combiners
     15.2.5 Focus spread in waveguide combiners
     15.2.6 Polarization conversion in diffractive waveguide combiners
     15.2.7 Propagating full-color images in the waveguide combiner over a maximum FOV
     15.2.8 Waveguide-coupler lateral geometries
     15.2.9 Reducing the number of plates for full-color display over the maximum allowed FOV

16 Manufacturing Techniques for Waveguide Combiners
16.1 Wafer-Scale Micro- and Nano-Optics Origination
     16.1.1 Interference lithography
     16.1.2 Multilevel, direct-write, and grayscale optical lithography
     16.1.3 Proportional ion beam etching
16.2 Wafer-Scale Optics Mass Replication

17 Smart Contact Lenses and Beyond
17.1 From VR Headsets to Smart Eyewear and Intra-ocular Lenses
17.2 Contact Lens Sensor Architectures
17.3 Contact Lens Display Architectures
17.4 Smart Contact Lens Fabrication Techniques
17.5 Smart Contact Lens Challenges

18 Vergence-Accommodation Conflict Mitigation
18.1 VAC Mismatch in Fixed-Focus Immersive Displays
     18.1.1 Focus rivalry and VAC
18.2 Management of VAC for Comfortable 3D Visual Experience
     18.2.1 Stereo disparity and the horopter circle
18.3 Arm's-Length Display Interactions
18.4 Focus Tuning through Display or Lens Movement
18.5 Focus Tuning with Micro-Lens Arrays
18.6 Binary Focus Switch
18.7 Varifocal and Multifocal Display Architectures
18.8 Pin Light Arrays for NTE Display
18.9 Retinal Scan Displays for NTE Display
18.10 Light Field Displays
18.11 Digital Holographic Displays for NTE Display

19 Occlusions
19.1 Hologram Occlusion
19.2 Pixel Occlusion, or "Hard-Edge Occlusion"
19.3 Pixelated Dimming, or "Soft-Edge Occlusion"

20 Peripheral Display Architectures

21 Vision Prescription Integration
21.1 Refraction Correction for Audio-Only Smart Glasses
21.2 Refraction Correction in VR Headsets
21.3 Refraction Correction in Monocular Smart Eyewear
21.4 Refraction Correction in Binocular AR Headsets
21.5 Super Vision in See-Through Mode

22 Sensor Fusion in MR Headsets
22.1 Sensors for Spatial Mapping
     22.1.1 Stereo cameras
     22.1.2 Structured-light sensors
     22.1.3 Time-of-flight sensors
22.3 Head Trackers and 6DOF
22.4 Motion-to-Photon Latency and Late-Stage Reprojection
22.5 SLAM and Spatial Anchors
22.6 Eye, Gaze, Pupil, and Vergence Trackers
22.7 Hand-Gesture Sensors
22.8 Other Critical Hardware Requirements



This book is a timely review and analysis of the various optical architectures, display technologies, and optical building blocks used today for consumer, enterprise, or defense head-mounted displays (HMDs) over a wide range of implementations, from smart glasses and smart eyewear to augmented-reality (AR), virtual-reality (VR), and mixed-reality (MR) headsets.

Such products have the potential to revolutionize how we work, communicate, travel, learn, teach, shop, and are entertained. An MR headset can come in either an optical see-through mode (AR) or a video-pass-through mode (VR). Extended reality (XR) is another acronym frequently used to refer to all variants of MR.

Already, market analysts have very optimistic expectations for the return on investment in MR, in both enterprise and consumer markets. To meet such high expectations, however, several challenges must be addressed: one is defining the use cases for each market segment, and the other is developing the MR hardware itself.

The intent of this book is not to review generic or specific AR/VR/MR use cases, applications, or implementation examples, as these have already been well defined for enterprise, defense, and R&D but only extrapolated for the burgeoning consumer market. Instead, the book focuses on hardware issues, especially on the optics side. Hardware architectures and technologies for AR and VR have made tremendous progress over the past five years, at a much faster pace than ever before, fueled mainly by recent investment hype in start-ups and accelerated mergers and acquisitions by larger corporations.

The two main pillars that define most MR hardware challenges are immersion and comfort. Immersion can be defined as a multisensory perception feature (starting with audio, then display, gestures, haptics, etc.). Comfort comes in several forms:

  • wearable comfort (reducing weight and size, pushing back the center of gravity, addressing thermal issues, etc.),
  • visual comfort (providing accurate and natural 3D cues over a large FOV and at high angular resolution), and
  • social comfort (allowing for true eye contact, in a socially acceptable form factor, etc.).

To address both comfort and immersion challenges effectively through improved hardware architectures and software developments, a deep understanding of the specific features and limitations of the human visual perception system is required. The book emphasizes the need for a human-centric optical design process, which allows for the most comfortable headset design (wearable, visual, and social comfort) without compromising the user's immersion experience (display, sensing, interaction). Matching the display architecture to the specifics of the human visual perception system is key to reducing the constraints on the hardware to acceptable levels, allowing for effective functional headset development and mass production at reasonable cost.

The book also reviews the major optical architectures, optical building blocks, and related technologies that have been used in existing smart glasses and AR, VR, and MR products, or that could be used in novel XR headsets in the near future to overcome such challenges. Providing the user with a visual and sensory experience that addresses all aspects of comfort and immersion will eventually help fulfill the market analysts' wild expectations for the coming years across all headset categories.

The other requirement, which may be even more important than hardware, depends on the worldwide app-developer community taking full advantage of such novel hardware features to develop specific use cases for MR, especially for the consumer market.

Bernard Kress
December 2019

© SPIE