Proceedings Volume 10676

Digital Optics for Immersive Displays


Volume Details

Date Published: 2 August 2018
Contents: 6 Sessions, 44 Papers, 6 Presentations
Conference: SPIE Photonics Europe 2018
Volume Number: 10676

Table of Contents

All links to SPIE Proceedings will open in the SPIE Digital Library.
  • Front Matter: Volume 10676
  • Optical Challenges for Next-generation AR/VR headsets
  • Design, Fabrication and Testing of Novel Optics for AR/VR systems
  • Holographic Optics for AR/VR Systems
  • Improving visual comfort in AR/VR systems
  • DOID Student Optical Design Challenge for VR/AR and MR: Poster Presentations
Front Matter: Volume 10676
Front Matter: Volume 10676
This PDF file contains the front matter associated with SPIE Proceedings Volume 10676, including the Title Page, Copyright information, Table of Contents, and Conference Committee listing.
Optical Challenges for Next-generation AR/VR headsets
Field of view: not just a number
Near-eye display users universally request larger fields of view for enhanced immersion, presence, and device utility. Unlike frame rate or device weight, field of view cannot be represented precisely as a single number. Quoting field of view as a diagonal, a carry-over from the display industry, could refer to either the monocular or stereo field of view and gives no indication of the field of view boundary shape. This work defines an unambiguous metric evaluation of field of view based on solid angle, accounting for eye relief, interpupillary distance, eye rotation, and device alignment. The approach allows optical system designers to identify weak points in the optics/display/rendering pipeline. To accompany modeling, a measurement scheme was developed to metrically compare field of view over various real-world user conditions. Best practices for visualizing and communicating field of view are also presented. This work reviews the methods used to increase field of view, with discussion of the monocular and binocular artifacts that arise in large field of view systems. The limitations and advantages of optical tiling, canting, and extreme distortion are described, using relevant examples in the commercial VR space. The fundamental tradeoffs between resolution, field of view, and optical quality over field are discussed, including a review of methods to maximize field of view without sacrificing on-axis resolution. Until display and optics technology can fully match the human visual system, the intermediate objective is to find the best experience match in field of view, resolution, and optical quality given existing hardware limitations. Qualitative assessments of the relative value of different regions of the human visual field will be provided.
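The abstract does not spell out the metric's exact form; as a minimal illustration of why a solid-angle figure is more informative than a diagonal (assumed field sizes and the textbook rectangular-pyramid solid-angle formula, not the authors' implementation), the sketch below compares two fields that a single diagonal number would not distinguish.

```python
import numpy as np

def rect_fov_solid_angle(h_fov_deg, v_fov_deg):
    """Solid angle (steradians) of a rectangular field of view, given full
    horizontal/vertical angular extents, using Omega = 4*arcsin(sin(a)*sin(b))."""
    a = np.radians(h_fov_deg) / 2.0   # horizontal half-angle
    b = np.radians(v_fov_deg) / 2.0   # vertical half-angle
    return 4.0 * np.arcsin(np.sin(a) * np.sin(b))

# Two hypothetical headsets with similar "diagonal" figures but different coverage:
print(rect_fov_solid_angle(100, 70))   # wide, short field  -> ~1.82 sr
print(rect_fov_solid_angle(90, 90))    # square field       -> ~2.09 sr
```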
Optical design challenges from satellite imaging to augmented reality displays
The design challenge of an optical system is not limited to the optical design itself and its performance under nominal conditions; it extends to the realisation of a system, which includes tolerancing and manufacturability as well as meeting system requirements such as power consumption, heat dissipation, mass, cost constraints and timescales. In this presentation, optical designs for applications ranging from a satellite optical imager to augmented reality displays are presented to illustrate the different challenges that an optical design needs to address. Augmented reality will prove to be a very useful tool in our everyday lives; it brings elements of the virtual world into the real world, enhancing what we see, hear, and feel. With this comes a high demand for compact, light, affordable and high-quality displays.
Viewing optics for immersive near-eye displays: pupil swim/size and weight/stray light
Near-eye display performance is usually summarized with a few simple metrics such as field of view, resolution, brightness, size, and weight, which are derived from the display industry. In practice, near-eye displays often suffer from image artifacts not captured in traditional display metrics. This work defines several immersive near-eye display metrics such as gaze resolution, pupil swim, image contrast, and stray light. We will discuss these metrics and their trade-offs through review of a few families of viewing optics. Fresnel lenses are used in most commercial virtual reality near-eye displays in part due to their light weight, low volume and acceptable pupil swim performance. However, Fresnel lenses can suffer from significant stray light artifacts. We will share our measurements of several lenses and demonstrate ways to improve performance. Smooth refractive lens systems offer the option for lower stray-light viewing but usually at the cost of a much larger size and weight in order to get to the same pupil swim performance. This can be addressed by using a curved image plane but requires new display technology. Polarization-based pancake optics is promising and can provide excellent image resolution and pupil swim performance within an attractive form-factor. This approach, however, generally results in low light efficiency and poor image contrast due to severe ghosting. We will discuss some of the main limitations of that technology.
Design, Fabrication and Testing of Novel Optics for AR/VR systems
Ultra-compact multichannel freeform optics for 4xWUXGA OLED microdisplays
Marina Buljan, Bharathwaj Narasimhan, Pablo Benítez, et al.
We present an advanced optical design for a high-resolution, ultra-compact VR headset for high-end applications, based on multichannel freeform optics and four WUXGA OLED microdisplays developed under the EU project LOMID [1]. Conventional optical systems in VR headsets require a large distance between lenses and displays, which directly leads to the rather bulky and heavy commercial headsets we have at present. We managed to dramatically decrease both the required display size and the display-to-eye distance, reducing the latter to only 36 mm (compared to 60-75 mm in most conventional headsets). This ultra-compact optics reduces the headset weight and occupies about a fourth of the volume of a conventional headset with the same FOV. Additionally, our multichannel freeform optics provides excellent image quality and a large field of view (FOV), leading to a highly immersive experience. Unlike conventional microlens arrays, which are also multichannel devices, our design uses freeform optical surfaces to produce, even at oblique incidence, the highest optical resolution and Nyquist frequency of the VR pixels where it is needed. The LOMID microdisplays used in our headsets are large-area, high-resolution (WUXGA) microdisplays with compact, high-bandwidth circuitry, including special measures for high contrast through excellent blacks and low power consumption. The LOMID microdisplay diagonal is 0.98” with a 16:10 aspect ratio. With two WUXGA microdisplays per eye, our headset has a total of 4,800x1,920 pixels, i.e. close to 5k. As a result, our multichannel freeform optics provides a VR resolution of 24 pixels/deg and a monocular FOV of 92x75 degs (or 100x75 with a binocular superposition of 85%).
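As a rough back-of-the-envelope check of the quoted figures (assuming a uniform angular mapping, which the multichannel design deliberately does not have, and taking the per-eye pixel count from the abstract):

```python
# Two WUXGA (1920x1200) panels tiled per eye -> roughly 2400 x 1920 pixels per eye.
pixels_h, pixels_v = 2 * 1200, 1920
fov_h_deg, fov_v_deg = 92.0, 75.0        # monocular field of view from the abstract

print(pixels_h / fov_h_deg)   # ~26 px/deg horizontally
print(pixels_v / fov_v_deg)   # ~26 px/deg vertically
# Both land in the mid-20s, the same ballpark as the quoted 24 pixels/deg once the
# variable magnification over the field is taken into account.
```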
Casting technology for embedding optical elements into prescription spectacle lenses
D. Muff, L. Körner
Interglass Technology AG has developed a novel lens casting technology which enables embedding of functional elements into prescription spectacle lenses. Besides the important correction of ametropia, the technology provides secure protection of sensitive optical elements. Volume holograms in functional films might be used as optical combiners in augmented reality (AR) and mixed reality (MR) headsets. However, holographic diffraction gratings are susceptible to excessive heat and distortions. Therefore, mild process conditions are essential for the successful incorporation of holographic films into spectacle lenses. This study compares the standard lens casting process, which is normally used for lenses without functional films, with an optimized process, which allows better control of polymerization rate and reaction temperature. Evaluation of peak positions and diffraction efficiencies shows that both processes preserve full hologram functionality over the tested range of spherical powers. However, in terms of film adhesion, there is an obvious difference between the two processes. While the standard process leads to delaminations between film and polymer matrix during curing or lens edging, the optimized process yields reproducibly flawless lenses with sufficient adhesion. Furthermore, the effect of surface activation and lens thickness on film adhesion is analyzed. Testing analogous to ISO 15024 shows that adhesion evolution follows chemical kinetics, i.e. higher reaction temperatures during polymerization result in higher critical energy release rates. A possible explanation for this behavior is the establishment of covalent bonds between the activated film and the evolving polymer matrix.
Optical metrology for immersive display components and subsystems
Optical systems for immersive displays incorporate a range of optical components and assemblies that require precision non-contact metrology, including Fizeau interferometry of surface form, new techniques for aspheric microlenses, and interference microscopy for surface structure and texture analysis. Here we consider the problem of evaluating the parallelism and surface form deformation for stacked assemblies of multiple flat glass substrates. Similar structures are common for RGB planar waveguides with slanted sub-wavelength gratings acting as in- and out-couplers. In our experiments, we demonstrate the effectiveness of coherence scanning over a large aperture area using a 100-mm aperture white-light interferometer.
HMD quality evaluation of projected image: hardware assessment and software evaluation for distortions correction
Thomas Miletti, Nicola Truant, Entela Gurabardhi, et al.
Studies on Head Mounted Displays (HMDs) that integrate increased sensory capacities for the wearer have grown rapidly in recent years for applications in Augmented (AR), Virtual (VR) and Mixed Reality (MR). In this work, we focus our attention on the characterisation of the image projected by the Optical Module (OM) on board the “F4” model (an AR mask for industrial users) produced by GlassUp. We have used our own system to test the quality assessment process, but it can also be applied to other kinds of OM. The major difference between the real eye and the emulated one is that in the former the projected image is processed as a continuum by the brain, while in the latter the acquiring detector is discretised into pixels. After a proper resize, the images acquired by the detector (1900x1200 px) can be analysed with respect to the original images used as input for the display in the OM (640x480 px). Based on the Structural Similarity (SSIM) theory, we propose the definition of a new index (F-SSIM) to extract more reliable information on the quality of projected images. This approach can be used both for hardware validation trials and for the evaluation of digital corrections for pincushion distortion and vignetting. The quality assessments proposed in this work define an innovative resizing approach for the reference and acquired images for an optimal structural similarity comparison. The results of the F-SSIM and SSIM analyses are compared and discussed.
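The F-SSIM index itself is not specified in the abstract; as a baseline for comparison, a plain SSIM computation after resizing the acquired frame to the display resolution can be sketched with scikit-image (the function names below belong to that library, the image sizes follow the abstract, and the synthetic data stand in for real captures; this is not the authors' F-SSIM code):

```python
import numpy as np
from skimage.transform import resize
from skimage.metrics import structural_similarity

def ssim_after_resize(acquired, reference):
    """Resize the camera capture (e.g. 1900x1200) to the reference resolution
    (e.g. 640x480) and compute the plain SSIM index between grayscale images."""
    acquired_small = resize(acquired, reference.shape, anti_aliasing=True)
    return structural_similarity(acquired_small, reference,
                                 data_range=reference.max() - reference.min())

# Synthetic example in place of real display/camera data:
ref = np.random.rand(480, 640)
cap = resize(ref, (1200, 1900)) + 0.02 * np.random.rand(1200, 1900)
print(ssim_after_resize(cap, ref))
```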
Holographic Optics for AR/VR Systems
Curved wedges and shearing gratings for augmented reality
A. R. L. Travis, Jiaqi Chu, Andreas Georgiou
Designer spectacles look great and we want the same for a virtual display. A curved wedge guide will be described that can transfer the virtual image from a projector near the ear round to the pupil of the eye. The eye-box is tiny, but the plan is to steer it so as to follow the pupil of the eye, and this will be done by shearing the holographic combiner. Ray-tracing predicts a field of view of 115° per eye and a resolution of 2000 pixels per radian at the fovea using pre-distortion in the projector. Guide tolerances are lax, image accommodation is variable, and a few milliwatts suffice to steer the pupil.
Characterisation and optimisation of Volume Holographic Optical Elements (VHOEs) in AR combiners for ghost reduction
Marco Francardi, Nicola Truant, Enzo Francesca, et al.
Studies on head mounted displays that integrate increased sensory capacities for the wearer are growing rapidly for applications in augmented, virtual and mixed reality. This paper focuses on the characterisation of the combiner of the UNO augmented reality glasses by GlassUp: a volume holographic optical element, recorded at 532 nm on Bayfol® HX, providing high diffraction efficiency over a narrow bandwidth at the desired angles. Furthermore, it has a negligible impact on the width of the head mounted display’s frontal lens and on the view of the real world. Studying the volume hologram, we developed a characterisation tool that gave us insight into the patterns inside the polymer and served as a quality check for UNO. In particular, we designed a setup that provides Total Angular Characterisation through Optical Spectroscopy (TACOS), acquiring the visible spectral response of VHOEs as transmittance, reflectance or dispersion maps at arbitrary angles. Furthermore, it analyses the patterns through a custom fit based on the Kogelnik model. The early-stage volume holograms presented undesirable secondary efficiency peaks causing ghosts, impairing both the view of the real world and the image projected by the optical module. TACOS’ custom fit allowed us to conduct a quantitative analysis of the ghosts. In this work, we present an equilibrium study of all these parameters to find the best projection conditions. Moreover, we studied the impact of the recording parameters, e.g. exposure and power ratios, on the appearance of ghosts.
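For readers unfamiliar with the Kogelnik model invoked above, the textbook on-Bragg diffraction efficiency of a lossless, unslanted volume grating reduces to a one-line expression; the sketch below uses assumed film parameters (thickness and index modulation) and is not the TACOS fitting code:

```python
import numpy as np

def kogelnik_on_bragg(delta_n, thickness_um, wavelength_nm, theta_deg=0.0, reflection=True):
    """Textbook Kogelnik on-Bragg efficiency of a lossless, unslanted grating:
    reflection geometry eta = tanh^2(nu), transmission geometry eta = sin^2(nu),
    with nu = pi * delta_n * d / (lambda * cos(theta))."""
    d = thickness_um * 1e-6
    lam = wavelength_nm * 1e-9
    nu = np.pi * delta_n * d / (lam * np.cos(np.radians(theta_deg)))
    return np.tanh(nu) ** 2 if reflection else np.sin(nu) ** 2

# e.g. a hypothetical 16 um film with an index modulation of 0.03 probed at 532 nm:
print(kogelnik_on_bragg(0.03, 16.0, 532.0))   # ~0.99, i.e. high diffraction efficiency
```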
Bragg polarization gratings used as switchable elements in AR/VR holographic displays
Oksana Sakhno, Yuri Gritsai, Hagen Sahm, et al.
The high coherence of laser light sources is key to the application of diffractive optics in holographic AR/VR displays. This can be combined with switchable diffractive elements, which are advantageous for several optical functions used in immersive holographic displays, such as shutters and polarization filters, and for rapid beam deflection and selection. We demonstrate a compact, effective and robust wide-angle switchable beam-deflecting device based on circular polarization gratings with Bragg performance (Bragg-PG) and a polarization switch. Such a grating/polarization-switch pair may, for instance, serve as a discrete switchable deflection element or as a switching element for pre-deflection with field lenses in holographic AR/VR displays. Micrometer-thick circular polarization gratings characterized by high diffraction efficiency (DE > 95%), large diffraction angles (up to 30°) and wide angular and wavelength acceptance were developed. In the presented embodiment, the output signal is switched between the zero and first diffraction orders by the handedness of the circular polarization of the incident light. Stacking two such oppositely aligned gratings can double the deflection angle. These gratings are the result of a two-step photochemical/thermal processing procedure applied to a photocrosslinkable liquid crystalline polymer (LCP). The holographic patterning provides high spatial resolution (period < 700 nm) and arbitrary orientation of the LC director, as well as high optical quality and thermal and chemical stability of the final gratings. Highly efficient (DE > 95% in the visible spectral range) and stable symmetric and slanted circular Bragg polarization gratings were fabricated using the developed material and processing technique. The large usable diffraction angles combined with the high DE make the Bragg-PG attractive for HMD AR/VR applications because of the system-inherent short focal lengths and large numerical apertures needed to meet the tight space budget in HMDs and other optical systems.
DigiLens switchable Bragg grating waveguide optics for augmented reality applications
DigiLens’s Switchable Bragg Grating (SBG) waveguides enable switchable, tunable and digitally reconfigurable color waveguide displays with a field of view, brightness and form factor surpassing those of competing technologies. DigiLens waveguides can be laminated to integrate multiple optical functions into a thin transparent device. DigiLens waveguide gratings are printed into a proprietary polymer and liquid crystal mixture that can provide any required combination of diffraction efficiency and angular bandwidth in a thin waveguide with high transparency and very low haze. The display combines two key components: an image generation module, essentially a pico projector, and a holographic waveguide for propagating and expanding the image vertically and horizontally. Color is provided by a stack of monochrome waveguides, each capable of addressing the entire field of view and incorporating an input rolled K-vector grating, a fold grating, and an output grating. Rolling the K-vectors expands the effective angular bandwidth of the waveguide. Fold gratings enable two-dimensional beam expansion in a single waveguide layer, which translates into lower manufacturing cost, reduced haze, and improved image brightness. The design of these complex SBGs is complicated by their birefringent properties, taking the design of DigiLens waveguides well beyond the frontiers of established ray-tracing codes. Our paper summarizes the key features of DigiLens waveguide technology and discusses our optical design methodology, with examples from DigiLens’s current waveguide HUD products.
Wavelength multiplexing recording of vHOEs in Bayfol HX photopolymer film
Friedrich-Karl Bruder, Sven Hansen, Christel Manecke, et al.
Photopolymer films (Bayfol® HX) have recently been introduced into the marketplace and have proven easy to process for volume holographic optical element (vHOE) recording. The instant-developing Bayfol® HX holographic photopolymer film provides full color capability and adjustable diffraction efficiency, as well as unprecedented optical clarity compared to classical volume holographic recording materials such as silver halide emulsions (AgHX) or dichromated gelatin (DCG). Besides the recording step, no pre- or post-processing is necessary, and easy mass production of vHOEs in a completely dry roll-to-roll process is possible. Due to the nature of vHOEs, multiplexed recording can be used to superimpose multiple optical functions in a single layer. This makes it possible to merge angularly precise, full-color diffractive combiner optics in one layer by spectral multiplexing, such as RGB recording in Bayfol® HX film. Further optical sensing functions may be added by additional angular multiplexing. For reflection-type vHOE recording, the need for additional partially reflective layers or for total internal reflection (TIR) in a light guide becomes obsolete. Obviously, these unique properties of vHOEs could significantly simplify the layer structure of immersive displays such as Head-Mounted Displays (HMDs) and Head-Up Displays (HUDs). In this paper we investigate and demonstrate wavelength-multiplexed recording in Bayfol® HX film, with a specific focus on the design of the optical recording setup and its system and stability margins. Well-controlled RGB recording power conditions enable high repeatability of the RGB efficiency balance of reflective vHOEs over extended operation periods.
Resonant screens focus on the optics of AR (Conference Presentation)
The recent development of meta resonant waveguide-gratings (RWGs) makes it possible to extend the unique optical properties of resonant waveguide-gratings beyond the specular reflection and direct transmission to which they have been confined for the last 35 years. Exploiting Wood's second type of anomalies, the so-called resonance anomalies, meta-devices based on horizontal guided-mode resonances (GMR) offer excellent transparency outside the resonance condition. Meta RWGs can be engineered to exhibit highly selective and tunable optical properties to realize, for example, monochromatic diffraction gratings, monochromatic metalenses and meta-couplers, with very compact nanostructures of flat aspect ratio. Examples of new optical combiner implementations for augmented and mixed reality will be presented. A roadmap for future developments will be sketched, along with their advantages and limitations, compared to surface relief gratings, volume holograms and visible-light metasurfaces. The author will focus on the implementation of meta-RWGs in augmented and mixed reality systems, addressing the many challenges of such see-through near-eye display systems: a large field of view, a large eye-box with sufficient eye-relief distance, high transparency, high compactness and low weight, high pixel angular resolution, as well as a mitigated vergence-accommodation conflict or a non-single/fixed focal plane architecture.
Improving visual comfort in AR/VR systems
Varifocal technologies providing prescription and VAC mitigation in HMDs using Alvarez lenses
R. E. Stevens, D. P. Rhodes, A. Hasnain, et al.
We present a varifocal system for generating consistent accommodation cues and providing prescription correction in Virtual Reality Head-Mounted Displays (VR HMDs). The proposed approach mitigates the Vergence-Accommodation Conflict (VAC), a fundamental cause of discomfort in today’s VR, and eliminates the need for corrective eyeglasses inside head-mounted displays. We augment traditional objective lenses with a focus-adjustable optical system based on Alvarez lenses, and demonstrate a proof-of-concept integration into a commercial mobile VR headset. This paves the way to lighter, thinner, and more comfortable headsets, enabling the prolonged use of VR with minimal visual discomfort.
Computationally efficient and antialiased dual-layer light-field displays
Konstantin Kolchin, Gleb Milyukov, Sergey Turko, et al.
Factored light-field (LF) technology helps resolve the vergence-accommodation conflict inherent in most conventional stereoscopic displays. The remaining challenges include decreasing the computational cost of light-field factorization and improving image quality. We prototyped a dual-layer light-field stereoscope with a smartphone used as the display. We implement and compare three different methods of rank-one LF factorization and two ways of initializing them. The weighted rank-one residual iterations (WRRI) and the weighted nonnegative matrix factorization (WNMF) proved almost twice as fast as Huang et al.'s method in our implementation. Our tests revealed that the best initialization for all three methods is the square root of the LF central-view values; with it, one to two iterations are enough to achieve acceptable image quality.
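As a generic illustration of rank-one light-field factorization (not the authors' WRRI/WNMF implementation), alternating weighted least-squares updates with a square-root initialization in the spirit of the abstract can be written as:

```python
import numpy as np

def rank_one_wls(L, W, iters=2):
    """Weighted rank-one factorization L ~ outer(a, b) via alternating closed-form
    weighted least-squares updates. A generic sketch with a toy initialization:
    the abstract initializes from the square root of the LF central view, here we
    simply take square roots of row/column means of the light-field matrix."""
    a = np.sqrt(np.clip(L, 0, None).mean(axis=1))
    b = np.sqrt(np.clip(L, 0, None).mean(axis=0))
    eps = 1e-8
    for _ in range(iters):
        a = ((W * L) @ b) / ((W * b**2).sum(axis=1) + eps)     # a_i = sum_j W_ij L_ij b_j / sum_j W_ij b_j^2
        b = ((W * L).T @ a) / ((W.T * a**2).sum(axis=1) + eps)  # symmetric update for b
    return np.clip(a, 0, 1), np.clip(b, 0, 1)

# Toy example: a 9-view x 256-ray light-field matrix with uniform weights.
L = np.random.rand(9, 256)
W = np.ones_like(L)
a, b = rank_one_wls(L, W)
print(np.linalg.norm(L - np.outer(a, b)))
```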
Experimental evaluation of self-focusing image formation in unconventional near-eye display
We demonstrate an experimental evaluation of the self-focusing effect used for image formation in an unconventional near-eye display. The impact of the spectral bandwidth of the light source used to project the image is investigated in an experimental set-up and through multiple interference simulations. The results show that the self-focusing effect is robust and does not require a highly coherent laser source. Simulations conducted with initial experimental holographic recording data show that our concept can be implemented with an LED array as the primary source. Considerations for intraocular image formation quality will be discussed further during the conference.
Visual comfort is key to broad MR hardware acceptance (Conference Presentation)
We review the various display engine technologies and subsequent optical architectures used today in AR and see-through MR headsets to enable the best possible three-dimensional immersion experience for the user. Special consideration will be given to the degree of adequacy of the display architecture to the human perceptive system in order to increase visual comfort for prolonged usage. Wearable comfort issues are also addressed as key issues for broad acceptance, and range from size, weight, center of gravity to thermals, fabrics and more.
DOID Student Optical Design Challenge for VR/AR and MR: Poster Presentations
Improving image quality of 360-degree viewable holographic display system by applying a speckle reduction technique and a spatial filtering
Yongjun Lim, Keehoon Hong, Hayan Kim, et al.
Generally, in electronic holographic display systems, coherent light sources are used to reconstruct holograms. The random phase distribution of an object image causes unwanted dark and bright spots that degrade the reconstructed hologram images. In addition, the periodic structure of available spatial light modulators, such as liquid crystal on silicon devices and digital micro-mirror devices, generates various diffractive signals when they are illuminated by coherent light sources. Consequently, it is necessary to select a proper signal band in the spatial frequency domain by effectively filtering out unwanted signals. In this paper, the speckle pattern in a table-top holographic display system is measured and a method for reducing the speckle patterns is presented.
Design of a freeform gradient-index prism for mixed reality head mounted display
Freeform prism systems are commonly used in head mounted display systems for augmented, virtual, and mixed reality. They have a wide variety of applications, from scientific uses such as medical visualization to defense uses such as flight-helmet information. The advantage of the freeform prism design over other designs is its ability to provide a large field of view and low f-number while maintaining a small and lightweight form factor. Current designs typically employ a homogeneous material such as polymethyl methacrylate (PMMA). Using a GRIN material gives the designer extra degrees of freedom by allowing a variable refractive index within the prism. The addition of the GRIN material allows light to bend within the material instead of only reflecting off the surfaces. This work looks at implementing a freeform gradient-index (GRIN) material into a freeform prism design to improve performance, increase the field of view (FOV), and decrease the form factor through the use of 3D-printable polymers. A prism design with freeform GRIN is presented with a FOV of 45°, an eye relief of 18.25 mm, an eyebox of 8 mm, and MTF performance greater than 10% at 50 lp/mm.
Optical design, assembly, and characterization of a holographic head mounted display
A. Gärtner, R. Häussler, B. Fleck, et al.
We present the development and investigation of a holography-based head mounted display (HMD) that uses the proprietary Viewing-Window (VW) technology of SeeReal Technologies to generate a holographic scene. Considering various specification requirements, such as field of view (FOV) and resolution, the HMD system was developed using the optical design software Zemax. A prototypical HMD was set up in the laboratory and several tests, e.g. of the resolution limit, field of view, and spatial resolution of the holographic reconstruction, were conducted to investigate its performance. The HMD system reaches a resolution limit of 2.2 cycles/mm at a distance of 1500 mm between the observer and the image plane. The FOV is 4.1° in the horizontal and 2.3° in the vertical direction. By tuning the focus of a camera, it was demonstrated that the holographic reconstruction is spatially resolved in three dimensions. Taking into account all technical specifications and restrictions, the image quality of the HMD was evaluated as good. The holographic imaging technique has been demonstrated to avoid the conflict between accommodation and convergence of the eye and is a promising approach for future HMD technology.
Mitigating vergence-accommodation conflict for near-eye displays via deformable beamsplitters
David Dunn, Praneeth Chakravarthula, Qian Dong, et al.
Deformable beamsplitters have been shown as a means of creating a wide field of view, varifocal, optical see-through, augmented reality display. Current systems suffer from degraded optical quality at far focus and are tethered to large air compressors or pneumatic devices, which prevents small, self-contained systems. We present an analysis of the shape of the curved beamsplitter as it deforms to different focal depths. Our design also demonstrates a step forward in reducing the form factor of the overall system.
Designing of a monocular see-through smart glass imaging system
Augmented reality systems are becoming very popular nowadays. One example of such a system is a monocular see-through smart glass display. Smart glasses can greatly simplify people's lives: they can quickly find and visualize information, show a map and navigate in real time, and perform many other useful tasks. In this type of system, a microdisplay is used as the image generator. We used an AMLCD microdisplay with 640×480 resolution and a 7.2×5.4 mm display size. This provides the required angular pixel resolution of at most 1.5 arcmin and a diagonal field of view of at least 20 degrees. To relay the image from the microdisplay to the eye, we used several mirrors. We consider a monocular see-through smart glass imaging system with the AMLCD microdisplay as the source of the superimposed image. Thus, the main objective of this research is to design and analyze an optical architecture for the monocular smart glass display and to achieve the required characteristics.
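The quoted figures are mutually consistent under first-order optics: assuming a collimating focal length chosen so that the 9 mm display diagonal spans 20°, one pixel subtends roughly 1.5 arcmin (a hypothetical worked check, not taken from the paper):

```python
import numpy as np

# Display: 640x480 pixels on 7.2x5.4 mm -> 11.25 um pitch, 9 mm diagonal.
pitch_mm = 7.2 / 640
diag_mm = np.hypot(7.2, 5.4)

# Focal length that maps the display diagonal onto a 20-degree field of view.
f_mm = (diag_mm / 2) / np.tan(np.radians(20.0 / 2))

# Angular subtense of one pixel at that focal length, in arcminutes.
pixel_arcmin = np.degrees(pitch_mm / f_mm) * 60
print(f_mm, pixel_arcmin)   # ~25.5 mm and ~1.5 arcmin, consistent with the abstract
```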
A reflective prism for augmented reality with large field of view
Prism-based augmented reality displays are affordable solutions for head mounted devices. In this paper we employ a seldom-applied reflective prism form that does not require rays to undergo total internal reflection (TIR) at the surfaces. The different folding geometry therefore leads to more freedom in the optical architecture and better performance. Furthermore, to achieve full-color operation, an off-axis element is cemented on to correct the chromatic aberration. As for the specifications, the eye relief is 18 mm and the field of view is 50° horizontally and 30° vertically.
Design of a spatially multiplexed light field display on curved surfaces for VR HMD applications
Tianyi Yang, Nicholas S. Kochan, Samuel J. Steven, et al.
A typical light field virtual reality head-mounted display (VR HMD) comprises a lenslet array and a display for each eye. An array of tiled subobjects shown on the display reconstructs the light field through the lenslet array, and the light field is synthesized into one image on the retina. In this paper, we present a novel compact design of a binocular spatially multiplexed light field display system for VR HMDs. Contrary to the flat lenslet array and flat display used in current light field displays, the proposed design explores the viability of combining a concentric curved lenslet array and a curved display with optimized lenslet shape, size and spacing. The design of placing the lenslet array on a spherical surface is investigated and the specification tradeoffs are shown. The system displays the highest resolution in whichever direction the eye gazes. The design form is thin and lightweight compared to most other VR optical technologies. Furthermore, the use of a curved display reduces the complexity of the optical design and wastes fewer pixels between subobjects. The design simultaneously achieves a wide field of view, high spatial resolution, a large eyebox and a relatively compact form factor.
See-through smart glass with adjustable focus
Hossein Shahinian, Todd Noste, Nicholas Sizemore, et al.
The design proposed in this abstract, for a monocular see-through smart glass (Design Challenge #1), leverages a varifocal lens to accommodate human eyes with different focusing abilities. The eyepiece is made of three separate segments. The varying focus of the system is achieved by using two freeform Alvarez surfaces. The Alvarez lens proposed here has the advantage of achieving different focal lengths by laterally shearing the optics with respect to each other. As a proof of concept, it is shown that the Alvarez lenses provide constant performance for different eye conditions.
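For context on how lateral shear translates into focus change, the textbook first-order relation for an Alvarez pair with cubic surfaces z(x, y) = A(xy² + x³/3) is P ≈ 2(n−1)Aδ for a relative shear δ; the numbers below are illustrative assumptions, not this paper's design values:

```python
def alvarez_power(A_per_mm2, shear_mm, n=1.53):
    """First-order optical power (diopters) of an Alvarez pair with cubic surfaces
    z(x, y) = A*(x*y**2 + x**3/3), for a relative lateral shear along x.
    Textbook approximation P ~= 2*(n-1)*A*shear (thin elements, small shear)."""
    A_per_m2 = A_per_mm2 * 1e6          # 1/mm^2 -> 1/m^2
    shear_m = shear_mm * 1e-3           # mm -> m
    return 2.0 * (n - 1.0) * A_per_m2 * shear_m

# e.g. a hypothetical A = 0.002 mm^-2 plate pair sheared by 1 mm gives ~2 diopters:
print(alvarez_power(0.002, 1.0))
```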
Ultrathin full color visor with large field of view based on multilayered metasurface design
Ori Avayu, Ran Ditcovski, Tal Ellenbogen
An augmented reality display system based on an ultra-thin see-through stacked metasurface near eye visor is proposed. We use the unique capabilities of plasmonic metasurfaces to control light at the subwavelength scale and design a see-through diffractive element that can project a full color image from a micro-display into the eye. This element is comprised of three metasurface layers, each designed to diffract only a specific wavelength and keep the rest of the visible spectrum unaffected. By implementing this layered design we can harness the advantages of diffractive optics and reduce their chromatic aberrations. We present here the design process of the proposed metasurface near eye visor and validate it by fabricating and testing a proof of concept sample.
A vergence accommodation conflict-free virtual reality wearable headset
When using a Virtual Reality Headset (VRH), fatigue, headaches or even sight issues can quickly arise. In this article we present an optical design for a Virtual Reality Headset that is free of any Vergence-Accommodation Conflict (VAC) while still small enough to be worn. While solving the VAC, the optical design is kept simple by moving the screen from the object focal point, which sends the virtual image to infinity, to a point closer to the optics, which brings the virtual image to a selected distance. Although this solution was proven effective in [1], that work only studied a proof of principle and did not address the optical design, so our work mainly consists in the optimization of a wearable virtual reality headset.
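The screen-shift trick described above is plain Gaussian optics: with the screen at the front focal point the virtual image is at infinity, and moving the screen toward the lens pulls the image in to a finite distance. A small worked check with assumed values (not this paper's prescription):

```python
def virtual_image_distance(f_mm, screen_mm):
    """Thin-lens magnitude form: a screen at distance 'screen' (< f) in front of an
    eyepiece of focal length f forms a virtual image at d = screen*f / (f - screen)."""
    return screen_mm * f_mm / (f_mm - screen_mm)

# Assumed example: a 40 mm eyepiece with the screen pulled 4 mm inside its focal point
# places the virtual image at ~360 mm instead of infinity.
print(virtual_image_distance(40.0, 36.0))
```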
Ultrathin optical combiner with microstructure mirrors in augmented reality
Using a waveguide as an optical combiner in optical see-through head-mounted displays has obvious advantages over other types of combiners due to its low-cost, lightweight, ultra-thin and easily manufactured nature. In this paper, an ultra-thin, glasses-like augmented reality (AR) display system is presented. The design methods for the geometrical waveguide with microstructure mirrors and its freeform collimator are discussed. With freeform optics and an ultra-thin waveguide configuration, this AR display achieves a 30° field of view (FOV) with an angular resolution as fine as 1.21 arcminutes.
Wide field-of-view waveguide displays enabled by polarization-dependent metagratings
We proposed a waveguide display design based on polarization dependent metagratings. By encoding the left and right half of field of view (FOV) in two orthogonal polarization channels, we achieved an overall horizontal FOV of 67° at 460 nm using a single waveguide, which is 70% larger than that achieved with conventional diffractive gratings. Metagratings that selectively diffract out TE or TM polarized light are designed and simulated using rigorous coupled wave analysis (RCWA). High polarization selectivity is achieved, with minimal crosstalk between the two channels. The transmission spectrum at normal incidence is calculated to assess the see-through effect. Remaining challenges such as fabrication and efficiency issues are discussed. The concept of multiplexing information in the polarization domain enables wide FOV waveguide displays for future AR devices.
Over-designed and under-performing: design and analysis of a freeform prism via careful use of orthogonal surface descriptions
In this paper, two freeform prism combiner designs with different geometries were studied. The first design, whose geometry is driven by the need for total internal reflection, achieves optical performance suitable for use in AR/VR applications, but involves highly complex surfaces and highly non-uniform performance. The second design, which removes the total-internal-reflection requirement, adopts a modified geometry that enables significantly improved aberration correction potential. The nodal-aberration-theory-based design process is shown for both prism designs, and the optical performance of each design is analyzed. Performance exceeds 10% MTF at 50 lp/mm over centered and decentered 3 mm effective subpupils, evaluated at nine different positions within an 8 mm diameter eyebox.
Shape scanning displays: tomographic decomposition of 3D scenes
Seungjae Lee, Youngjin Jo, Dongheon Yoo, et al.
Although there has long been a desire to implement ideal three-dimensional (3D) displays, it is still challenging to satisfy commercial demands in resolution, depth of field, form factor, eye-box, field of view, and frame rate. Here, we propose shape scanning displays that may have an extremely large depth of field (10 cm to infinity) without loss of frame rate or resolution, and a sufficient eye-box (7.5 mm) with a moderate field of view (30°). Furthermore, our prototype provides quasi-continuous focus cues as well as motion parallax through the reconstruction of 120 tomographic layers. Shape scanning displays consist of a tunable lens, a display panel, and a spatially adjustable backlight. The synchronization of the tunable lens and the spatially adjustable backlight provides an additional dimension of depth information. In summary, we introduce a novel 3D display technology, called shape scanning displays, that presents superior performance in resolution, depth of field, and focus cue reproduction. This approach has great potential for application in various fields of 3D displays, including head-up displays, tabletop displays, and head-mounted displays. It could be an efficient solution to the vergence-accommodation conflict, as it provides accurate focus cues.
Polarization-dependent metasurfaces for 2D/3D switchable displays
Zhujun Shi, Federico Capasso
We propose a 2D/3D switchable display design based on polarization-dependent metasurfaces. Metasurfaces are ultrathin planar optical devices patterned with subwavelength nanostructures. We design the metasurfaces such that they simultaneously deflect right-hand circularly polarized (RCP) light to an angle and transmit left-hand circularly polarized (LCP) light in the normal direction. Combined with an active polarization rotator, the device can be switched between a high-resolution 2D display mode and a multiview 3D display mode. Proof-of-principle metasurface designs are demonstrated. The far-field radiation patterns in the 2D and 3D modes are simulated and analyzed. The effects of spectral bandwidth and beam directionality are also discussed. Compared with liquid crystal lenses, which are the key element in previous 2D/3D switchable displays, metasurfaces 1) deliver more precise phase profile control, and thus fewer aberrations and higher image quality; 2) offer additional degrees of freedom in polarization manipulation; and 3) can be adapted to much smaller sizes.
High-performance integral-imaging-based light field augmented reality display
Hekun Huang, Hong Hua
A new design of a head-mounted optical see-through light field display based on integral imaging is proposed to achieve both a wide see-through view and a high-resolution light field display over a large depth volume. The design, which incorporates custom-designed freeform optics, a tunable lens and an aperture array, offers a true 3D display view of 30° by 18° in the horizontal and vertical directions, respectively, and a crosstalk-free eyebox of 6 mm by 6 mm. Owing to the capability of dynamically tuning the position of the central depth plane using the tunable lens, the virtual display is able to render the light field of a true 3D scene and maintain a spatial resolution of 3 arc minutes across a depth range of over 3 diopters. Due to the unique design of the freeform eyepiece, the see-through optics provides a field of view of 65° by 40° with an angular resolution as high as 0.5 arc minutes and very low distortion of the see-through view.
Design and stray light analysis of a lenslet-array-based see-through light-field near-eye display
This study proposes an optical see-through light-field near-eye display (OST LF-NED) based on integral imaging (InI) using a discrete lenslet array (DLA). A light-field image is used as the image source. A special microdisplay array built on a transparent substrate is used as the screen. A DLA is used as a spatial light modulator (SLM) to generate a dense light field of the 3-D scene inside the eyebox of the system and provide correct focus cues to the user. The key to realizing the OST capability is that the microdisplays and the lenslets are both discretely arranged, so that light from the real world passes directly through the gaps between the microdisplays on the transparent substrate and then through the flat portions of the DLA panel, providing a clear view of the real world as well as the virtual information. The stray light can be totally eliminated in the region of the eyebox in ideal situations. In practical situations, taking the limitation of the F-number into consideration, a trade-off between the size of the eyebox and the stray light is made. Analysis and simulation of the stray light are conducted in detail. A ring-shaped aperture on each lenslet is added to reduce the stray light significantly by blocking the screen light that passes by the outer edge of each lenslet. The simulation shows that the proposed method is capable of providing an OST view in LF displays.
High-resolution head mounted display using stacked LCDs and birefringent lens
Shuaishuai Zhu, Peng Jin, Wei Qiao, et al.
Head mounted displays (HMDs) have shown huge market potential in recent years. In these techniques, the vergence-accommodation conflict (VAC) is a fundamental problem that causes viewer discomfort and fatigue. To overcome this limitation, researchers have proposed many solutions, including Maxwellian view displays, vari-focal plane displays, multifocal plane displays, integral-imaging-based displays, and computational multilayer displays. These techniques can enable correct or nearly correct focus cues; however, they fail to achieve both a high image refresh rate and high lateral resolution with a compact architecture. In this paper, we propose a compact birefringent-based virtual reality (BVR) HMD with correct focus cues, obtained by spatially projecting the input images onto four depth planes. In the BVR, two stacked liquid crystal displays (LCDs) provide two axially separated input images in an additive fashion. We place a liquid crystal panel behind the LCDs to modulate the polarization of the light emitted from the LCDs pixel-wise. After that, a birefringent lens and an eyepiece project the modulated light onto four depth planes at 0 D, 1 D, 2 D, and 3 D. To minimize the astigmatism of the system, we employ a birefringent doublet with orthogonal optic axes and use an eyepiece to suppress the overall aberration. Compared to existing techniques, the proposed BVR mitigates the VAC problem with a compact architecture. Moreover, because there is no temporal multiplexing and no sacrifice of lateral resolution, the BVR can easily achieve a high image refresh rate and high lateral resolution. Herein, we present the optical design of the BVR and characterize its performance in Zemax.
A retinal-projection-based near-eye display for virtual reality
Lantian Mi, Wenbo Zhang, Chao Ping Chen, et al.
We propose a retinal-projection-based near-eye display for virtual reality. Our design is highlighted by an array of tiled organic light-emitting diodes and a transmissive spatial light modulator. Its design rules are to be set forth in depth, followed by the results and discussion regarding the field of view, angular pixel resolution, stereoscopic vision, modulation transfer function, distortion, contrast ratio, simulated imaging, and industrial design.
Understanding waveguide-based architecture and ways to robust monolithic optical combiner for smart glasses
Vincent Brac de la Perrière
With the emergence of Augmented Reality (AR) and Virtual Reality (VR) headsets during the past decade, companies and academic laboratories have worked on the design of optical combiners to improve their performance and form factor. Most of the smart glasses on the market have the asset of being small, which eases the integration of the combiner in a head-worn device. Most of them (Google Glass, Vuzix) use a prism-like architecture, where the collimation and deflection of the light are performed by one single optical piece. This approach reduces the size and tolerance issues of the device. Other companies (Optinvent, Microsoft, Lumus) adopted a waveguide architecture, in which the light is collimated by a lens or group of lenses, injected into a slab waveguide and extracted in front of the eye of the user. This way, the image is brought right in front of the eye, whereas prism-like architectures display the image in the peripheral sight of the user. These optical combiners, however, suffer from tight tolerances and fabrication complexity, as several pieces are combined. The injection and extraction of image rays in the waveguide can be performed either by holograms or by slanted mirrors. Each technology has its drawbacks, but so far the performance of holographic combiners has been disappointing, resulting in chromatic dispersion and thus degradation of the MTF. This paper reports work on a waveguide-type optical architecture designed for smart glasses. The system described in this paper was conceived as a solution for smart glasses use, for which the main concerns are the size of the eye box, adaptability, and a small form factor. Good optical performance was obtained, with a resolution of around 1.2 px/arcmin, together with a large eye box.
Compact see-through AR system using buried imaging fiber bundles
S. Thiele, P. Geser, H. Giessen, et al.
This design concept uses multi-core imaging fiber bundles with small diameters (<350 μm) to transfer information from an image source (e.g. a laser pico projector) to the eye of the user. One of the main benefits of this approach is that the resulting glasses are almost indistinguishable from conventional eyewear. Not only are the fiber bundles very thin and positioned close to the eye, but their difference in refractive index compared to the surrounding medium is comparatively small, which makes them hardly visible. At the same time, they can carry a significant space-bandwidth product and may be easier to fabricate than similar solutions using waveguides or Fresnel-type extractors. Using ray tracing and wave-optical considerations, we show that such an approach can lead to highly inconspicuous AR glasses with a >20° diagonal field of view and good angular resolution.
Design of an immersive head mounted display with coaxial catadioptric optics
The contradiction between a large field of view, a big exit pupil and a large eye relief in immersive virtual reality (VR) head-mounted displays limits the likelihood that such a bulky device will be accepted by most consumers. Typical of wide-field-of-view optical systems, the Oculus CV1 adopts a Fresnel lens as the eyepiece to display the magnified virtual image. However, the complexity of fabricating the Fresnel lens makes commercial VR products that use it cost-inefficient, especially when both surfaces employ a Fresnel surface. A wide-field-of-view coaxial catadioptric system is proposed in this paper to serve as the magnifying optics; it consists of a wire grid polarizer, two quarter-wave plates and a powered lens, making the optics thinner and fabrication easier. Note that a partially reflecting coating is applied to the surface of one quarter-wave plate. The powered lens takes the role of imaging, while the wire grid polarizer and the quarter-wave plates work together to adjust the polarization orientation so as to suppress direct transmission. The resulting monocular optical system yields a field of view as large as 110°, with the virtual image exhibited 5 meters away from the observer. The exit pupil diameter and eye relief of the designed system are 9 mm and 15 mm, respectively. Considerable distortion inevitably appears when the field of view approaches 100°, which could be tackled via electronic correction. According to evaluation in CODE V, the weight of the whole monocular optics is about 35 grams and the distance from the element nearest the observer to the image source is as short as 30 mm, showing that the suggested design is superior to singlet, doublet and aspheric or Fresnel singlet forms in terms of weight, compactness and fabrication cost. With this monocular optical system, an HMD binocular optical system can be achieved in the form of a partially overlapping field of view. The present design offers a solution for acquiring lightweight and small-volume VR optics while keeping the cost of commercial VR products acceptable to the public.
Ultra-Compact pancake optics based on ThinEyes super-resolution technology for virtual reality headsets
We present an advanced optical design for a high-resolution, ultra-compact Virtual Reality headset based on the traditional pancake configuration, with optical foveation in the following sense: firstly, the magnification is variable along the FoV, i.e. the VR pixel density is maximum at the center and gradually diminishes towards the edge; secondly, the optics image quality is also adapted, so that the MTF of any gazable field is best when it is directly gazed at. The combination of both is designed to fit the human visual acuity under normal eye movements, so the user does not perceive the lower peripheral resolution at the edges of the FoV. The VR pixel resolution (i.e. pixels per degree) of traditional pancake-configuration optics is limited by the geometrical distances between the different elements. However, we have broken that compromise by applying Limbak's ThinEyes® super-resolution technology to the design of four aspherical surfaces in a pancake-type configuration, so the VR pixel resolution is dramatically increased at the center of the virtual reality space while maintaining a high FoV and excellent imaging quality across the FoV. We make use of a curved reflective polarizer, which in practice could be produced by vacuum molding a polymeric one made of birefringent multilayer technology (such as 3M's DBEF). As an example, we present an optical system that uses a standard 2.85” square display with a pixel pitch of 35.5 microns (1440x1440 pixels). The total track length (eye pupil to display distance) of the system is 36 mm with 15 mm eye relief, so the lens thickness is only 21 mm. For a conventional optical design to achieve a higher VR pixel resolution of 24 pixels per degree, the compromise comes in the form of a reduced FoV of around 85 deg. These drawbacks do not bound our take on the pancake configuration. Thanks to the optical foveation, this design achieves a focal length of 49 mm at the center of the FoV, resulting in an outstanding VR pixel resolution of 24 pixels per degree at the center with a circular FoV of 100 deg for a 10 mm eyebox.
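The quoted central resolution follows directly from the stated pixel pitch and central focal length, as this small first-order check shows (a consistency exercise, not part of the paper):

```python
import numpy as np

pitch_mm = 0.0355        # 35.5 um display pixel pitch (from the abstract)
f_center_mm = 49.0       # focal length at the center of the field (from the abstract)

pixel_deg = np.degrees(pitch_mm / f_center_mm)   # angular size of one on-axis pixel
print(1.0 / pixel_deg)   # ~24 pixels per degree, matching the quoted figure
```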
Solving the vergence-accommodation conflict in head mounted displays with a magnifier system
In this work we propose a binocular system based on a Galilean telescope of the kind used by persons with low vision. First, a Galilean telescope is designed for a single eye. Then, once the Galilean telescope has been designed with an exit pupil size that matches the eye pupil, the binocular system is developed. Since the Galilean telescope is an afocal system, it allows the eye to remain relaxed; therefore, the use of a Galilean telescope is an option to mitigate the Vergence-Accommodation Conflict, because it is a system that magnifies the image. The advantage of the proposed system is that stereopsis allows accurate accommodation and vergence.
Augmented reality display system for smart glasses with streamlined form factor
Samuel Steven, Yang Zhao, Greg Schmidt, et al.
A smart glass augmented reality (AR) display system is designed with a streamlined form factor featuring an off-axis mirror design. The main component of the combiner optics is in the shape of a regular pair of eyeglasses or sunglasses, with no diffractive gratings, waveguides (lightguides), prisms or Fresnel surfaces involved. High-quality see-through performance is achieved with a low-cost combiner that consists of only highly manufacturable reflective surfaces. The 20-degree full field of view of the AR display is centered at about 30 degrees with respect to the center of the ocular vision. Such a design allows the user to have a clear, unobscured central field of view. At the same time, the projected image is accessible by moving the eyeball off the central vision. The system is designed with a circular eye box more than 10 mm in diameter.
High-resolution optical see-through vari-focal-plane head-mounted display using freeform Alvarez lenses
With recent developments in the manufacturing of freeform surfaces, Alvarez lenses have surfaced as an attractive method for achieving large focal ranges rapidly while still maintaining a compact structure. These characteristics make Alvarez lenses ideal for rendering correct focus cues in virtual/augmented reality (AR/VR), solving the vergence-accommodation conflict. This paper presents a novel design combining a compact eyepiece with two laterally shifting freeform Alvarez lenses to create a compact, high-resolution, tunable optical see-through head-mounted display (OST-HMD) design capable of optical power shifts from ~0-3 diopters. Currently limited by the speed of the actuators that mechanically translate the Alvarez lenses for optical power tuning, the display system is capable of achieving the entire range of focus shift with an update rate of 50 Hz. The proposed design renders near-accurate focus cues with high image quality and a large undistorted see-through field of view (FOV), utilizing a 1920x1080 color-resolution organic LED (OLED) microdisplay to achieve a virtual display FOV greater than 30 degrees diagonally, with an angular resolution of less than 0.85 arcminutes per pixel, an average optical performance of >0.4 contrast over the full field, and a contrast above 0.2 at the Nyquist frequency of 63 cycles/mm.
Super multi-view augmented reality glasses
A. Bolotova, A. Putilin, V. Druzhin
Nowadays, the main directions of augmented reality (AR) glasses development are: increasing the field of view (FoV) and eye-motion box; reducing the weight of the AR glasses; and solving the vergence-accommodation conflict. All these requirements should be met while maintaining high image quality and decreasing the dimensions of the AR glasses. We propose an optical system for AR glasses based on a Schmidt camera scheme to achieve a wide FoV and a large eye-motion box, using the Super Multi-View (SMV) technique to provide a multifocal system. The proposed optical design has major benefits: an eye-motion box of about 10 mm and a field of view of 60°, and it represents a lightweight, eye-fatigue-free solution with low aberrations. Finally, our system offers ample opportunity for further modification and improvement through the use of different image sources and projection systems.
PARA: experimental device for virtual and augmented reality
The device we present is a patented HMD with a custom stereo camera mounted on the front side. With a 110° field of view for both augmented and virtual reality, we apply live software-based distortions, using different methods for the HMD and camera lenses, for real-time rendering and the lowest photon-to-pixel time. The real and virtual spaces are properly aligned thanks to these real-time distortion methods. The device is also fully autonomous and can track its translation and rotation in known and unknown environments thanks to camera-based Simultaneous Localization and Mapping (SLAM), with the option of performing dense 3D reconstruction.