Proceedings Volume 11350

Digital Optics for Immersive Displays II

Volume Details

Date Published: 14 April 2020
Contents: 5 Sessions, 11 Papers, 9 Presentations
Conference: SPIE Photonics Europe 2020
Volume Number: 11350

Table of Contents

All links to SPIE Proceedings will open in the SPIE Digital Library.
  • Front Matter: Volume 11350
  • Digital Optics for AR and VR Systems
  • Digital Optics Fabrication and Testing for Immersive Displays
  • Digital Optics for 3D Imaging and 3D Display
  • 11350 Additional Presentations
Front Matter: Volume 11350
This PDF file contains the front matter associated with SPIE Proceedings Volume 11350, including the Title Page, Copyright information, Table of Contents, Author and Conference Committee lists.
Digital Optics for AR and VR Systems
Effects of polarisation and spatial coherence in the pupil expansion with crossed gratings in an AR display
Choon How Gan, Marie-Elena Kleemann, Anna Golos, et al.
Pupil replication with crossed gratings in an AR display is modelled and characterised. It is found that, compared to linear gratings, crossed gratings in a hexagonal lattice offer an additional degree of freedom to control the angular spread of pupils and can potentially improve the uniformity of the pupil map. The model explains the trends observed in experiments well, and good quantitative agreement of the relative pupil intensity is obtained for a number of the measured waveguides.
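As a rough illustration of how a hexagonal (crossed) grating replicates pupils in more directions than a linear grating, the sketch below enumerates the diffraction orders of a 2D hexagonal lattice and keeps those that remain guided by total internal reflection. All numerical values (wavelength, index, pitch, launch angle) are assumptions for illustration, not parameters from the paper.

```python
import numpy as np

# Illustrative parameters (not from the paper): 520 nm source,
# refractive index 1.8, hexagonal grating pitch 380 nm.
lam, n, pitch = 520e-9, 1.8, 380e-9
k0 = 2 * np.pi / lam          # vacuum wavenumber
kg = n * k0                   # wavenumber inside the waveguide

# Reciprocal basis of a hexagonal lattice (magnitude 2*pi/pitch,
# 60 degrees apart).
G = 2 * np.pi / pitch
G1 = G * np.array([1.0, 0.0])
G2 = G * np.array([np.cos(np.pi / 3), np.sin(np.pi / 3)])

# In-plane wavevector of the incoming guided beam (50 deg off normal).
theta_in = np.deg2rad(50)
k_in = kg * np.sin(theta_in) * np.array([1.0, 0.0])

# Enumerate low diffraction orders and keep those that stay guided:
# the in-plane wavevector must lie between k0 (TIR limit) and kg.
for m1 in range(-2, 3):
    for m2 in range(-2, 3):
        k_out = k_in + m1 * G1 + m2 * G2
        kt = np.linalg.norm(k_out)
        if k0 < kt < kg:
            az = np.rad2deg(np.arctan2(k_out[1], k_out[0]))
            print(f"order ({m1:+d},{m2:+d}): guided, azimuth {az:6.1f} deg")
```

With these illustrative numbers the zero order continues straight ahead while several (m1, m2) orders propagate at roughly ±60° and ±120° azimuth: the extra in-plane directions that give a 2D pupil map, which a linear grating lacks.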
Design of volume holographic lenses for operation at 850nm in augmented reality eyewear applications
Jianbo Zhao, Jilian Nguyen, Benjamin D. Chrysler, et al.
Recent research in augmented reality (AR) eyewear has prompted interest in the use of volume holographic optical elements (VHOEs) for this application. This interest in VHOEs is due to a number of factors, including: their formation in thin, lightweight films that can be deposited on a variety of substrates; the high diffraction efficiency, transparency, and low scatter of the resulting elements; the ability to multiplex several elements in the same aperture; and the potential for mass production by using replication methods. However, a number of design issues must be taken into consideration when using VHOEs, especially as input and output couplers that have optical power, as required for AR eyewear. One such issue is the design of input and output couplers with optical power for use at wavelengths that differ from the construction wavelength. For instance, most photopolymers and dichromated gelatin materials are sensitive in the blue-red (450-650 nm) wavelength range but not in the infrared (IR) (750-900 nm), where sensing is desired for AR systems. Several methods have been suggested in the literature to address this problem for holographic lenses; they vary in degree of complexity. The problem of making holographic lenses for waveguide input and output couplers at different wavelengths is even more complex due to the need to exceed the critical angle for the construction beams. Fortunately, optical sensing functions frequently do not require high resolution, and this can be used to advantage in the design process. In this paper, a design method is presented that combines wavefront/diffraction efficiency optimization, nonsequential raytracing, and wavefront compensation to form waveguide couplers with optical power, formed with a construction wavelength of 532 nm and a reconstruction wavelength of 850 nm. The aberrations caused by Bragg mismatch and the contrast reduction introduced by ghost images are analyzed by simulation and experiment. The experimental results show that an image resolution of ~10 lp/mm can be achieved with the holographic lens, with potential improvement to ~40 lp/mm by including a cylindrical lens in the reconstruction beams.
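The wavelength-shift problem described above can be made concrete with a textbook K-vector closure calculation: the grating vector K is fixed by the desired 850 nm playback geometry, and the 532 nm construction beams must satisfy k2 - k1 = K with both wavevectors on the 532 nm sphere. The sketch below solves this in two dimensions; the film index and angles are assumed for illustration and are not the paper's design values.

```python
import numpy as np

# Textbook K-vector closure: record at 532 nm a grating that is
# Bragg-matched at 850 nm. All values are illustrative assumptions.
n = 1.5                          # film refractive index (assumed)
lam_c, lam_r = 532e-9, 850e-9    # construction / reconstruction wavelengths
k_c = 2 * np.pi * n / lam_c
k_r = 2 * np.pi * n / lam_r

def kvec(theta, k):
    """Plane-wave wavevector in the x-z plane; theta measured from z."""
    return k * np.array([np.sin(theta), np.cos(theta)])

# Desired 850 nm playback inside the film: 0 deg in, 55 deg out
# (beyond the ~41.8 deg critical angle, i.e. a guided output).
K = kvec(np.deg2rad(55), k_r) - kvec(0.0, k_r)

# Solve kvec(t2, k_c) - kvec(t1, k_c) = K. From |k1 + K| = k_c:
# 2*k1.K + |K|^2 = 0, i.e. A*sin(t1) + B*cos(t1) = C.
A, B, C = 2 * k_c * K[0], 2 * k_c * K[1], -np.dot(K, K)
t1 = np.arcsin(C / np.hypot(A, B)) - np.arctan2(B, A)
k2 = kvec(t1, k_c) + K
t2 = np.arctan2(k2[0], k2[1])
print(f"construction beams: {np.rad2deg(t1):.1f} deg and "
      f"{np.rad2deg(t2):.1f} deg inside the film")
```

With these numbers the second construction beam lands near 44°, beyond the ~41.8° critical angle of the assumed n = 1.5 film, illustrating why recording waveguide couplers can require prism-coupled construction beams.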
Double-pass HOE operation for compact AR glasses design
The authors propose an approach for eliminating aberrations and enlarging the eye-box in augmented reality (AR) wearable devices, based on a point-to-point holographic optical element (HOE) lens combiner. The approach relies on double-pass light propagation through the HOE. A compact design and experimental results are presented. The proposed approach achieves a balance between compactness and virtual-image parameters in AR glasses.
Wide-field-of-view augmented reality eyeglasses using curved wedge waveguide
Anastasiia Kalinina, Andrey Putilin
The purpose of this study is the development of a wide-field-of-view waveguide-based augmented reality system. To this end, we examined the influence of curving and wedging the waveguide on the field of view in augmented reality systems. We applied the known Q_U ray tracing method to calculate the field of view in shaped waveguide-based systems and examined image transfer through such systems. We found that a shaped waveguide can transfer a significantly wider field of view than a planar waveguide; however, the image cannot be redirected by standard light-coupling techniques. Consequently, we propose a 73° field-of-view waveguide-based system consisting of a pico-projector as the image source, a shaped waveguide for image transfer, and a holographic combiner that redirects the beams to the eye pupil. The system is a compact, wide-field-of-view solution for integration in augmented reality devices.
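A toy two-dimensional bounce model (not the paper's Q_U method; all values assumed) shows the basic mechanism a wedge adds: each round trip between the tilted faces shifts the bounce angle by twice the wedge angle, so the guided angular range is no longer fixed as it is in a planar plate.

```python
import numpy as np

# Toy 2D ray bounce in a wedged slab (illustrative, not the paper's
# Q_U method). Bottom face is y = 0; top face is tilted by a small
# wedge angle: y = t0 + x * tan(alpha).
n, alpha, t0 = 1.5, np.deg2rad(1.0), 1e-3
theta_c = np.arcsin(1 / n)                 # TIR critical angle (~41.8 deg)

p = np.array([0.0, t0 / 2])                # start mid-thickness
th = np.deg2rad(43.0)                      # just above the TIR limit
d = np.array([np.sin(th), -np.cos(th)])    # heading toward the thick end

for bounce in range(12):
    if d[1] < 0:                           # travelling toward bottom face
        normal = np.array([0.0, 1.0])
        s = -p[1] / d[1]                   # distance to y = 0
    else:                                  # travelling toward tilted top
        normal = np.array([-np.sin(alpha), np.cos(alpha)])
        s = (t0 * np.cos(alpha) - normal @ p) / (normal @ d)
    p = p + s * d
    inc = np.arccos(abs(normal @ d))       # angle of incidence from normal
    print(f"bounce {bounce:2d}: incidence {np.rad2deg(inc):5.2f} deg "
          f"({'TIR ok' if inc > theta_c else 'leaks out'})")
    d = d - 2 * (normal @ d) * normal      # specular reflection
```

Launched toward the thick end, the ray moves away from the TIR limit by about 2α per round trip; launched toward the thin end the angle decreases instead and the ray eventually leaks, which hints at why image redirection out of a shaped guide needs non-standard couplers.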
Digital Optics Fabrication and Testing for Immersive Displays
Curved microdisplay, from optical design to mechanical study: impact on form-factor and light efficiency in visual systems
Form factor and light efficiency are important issues for Head-Mounted Displays, since both restrict their usage. Improving the form factor means that, for a given visual stimulus, the system is smaller in volume. Light efficiency is linked to power consumption and time of use, as well as the device's ability to deliver, in a given environment, enough luminance for the virtual image to be seen. This trade-off also appears in imaging systems: Christophe Gaschet previously explored the optical design of on-axis imaging systems using curved sensors, in particular the reduction of the number of diopters enabled by a Petzval-shaped image plane. However, the behavior of an optical system changes dramatically when the design is off-axis. This paper demonstrates, on a practical example, how a curved microdisplay helps to improve the form factor of an HMD system optimized with freeform optical design. Curvature can also play a major role in reducing light losses, but this imposes additional constraints on the shapes to be given to the microdisplay. We discuss the trade-offs between these two advantages of curved microdisplays. The mechanical feasibility of curved microdisplays is also discussed, as well as a process for making a curved microdisplay that is compatible with current mass-production CMOS displays. For OLED technology, the main resistance to curvature is the silicon substrate; GaN technologies show other mechanical limitations. We can predict the highest reachable curvature values depending on microdisplay size and technology.
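As a hedged back-of-envelope for the mechanical limit mentioned above: the surface strain of a plate of thickness t bent to radius R is eps = t/(2R), so R_min = t/(2*eps_max). The allowable strain used below is an assumed placeholder, not a measured value; real limits depend on processing and technology.

```python
# Back-of-envelope bend limit for a curved microdisplay substrate.
# Surface strain of a plate of thickness t bent to radius R is
# eps = t / (2 R), hence R_min = t / (2 * eps_max).
eps_max = 0.001                      # assumed allowable strain (0.1%)
for t_um in (725, 100, 50, 20):      # full wafer ... thinned die, in um
    R_min = t_um * 1e-6 / (2 * eps_max)
    print(f"t = {t_um:4d} um  ->  R_min = {R_min * 1e3:7.1f} mm")
```

The numbers make the qualitative point: thinning the silicon substrate by an order of magnitude reduces the reachable bend radius by the same factor.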
High refractive index glass wafers for AR waveguide technology: glass wafers in larger diameter to enable cost efficiency for consumer ready devices (Conference Presentation)
Berthold Lange
Among the variety of optical solutions for Head-Mounted Devices, waveguide technology is widely believed to be one of the most promising approaches to an affordable Augmented Reality (AR) / Mixed Reality (MR) experience, enabling a fully immersive user impression combined with the smallest form factor and uncompromised image quality. In the race toward the highest Field-of-View (FoV) and best image quality, the optical waveguide, made from specialty-grade high-index glass wafers, is the core component. The material and geometry characteristics of such wafers directly determine image properties such as maximum Field-of-View, contrast, brightness, and distortion of the image guided to the user's eye. We report advances in making glass wafers of larger diameter, to enable cost efficiency in subsequent processes. We provide insight into SCHOTT's material roadmap for combiner optics in Augmented Reality devices, including high refractive index. Finally, we present results of collaborative research between glass, resin, and equipment companies, targeting mass-producible solutions that allow optical component manufacturers to accelerate the merchantability of consumer-ready devices.
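The link between refractive index and field of view can be sketched with the standard single-waveguide bound: the coupling grating adds a fixed in-plane wavevector, so the in-air angular window must fit inside the band of guided angles, giving 2*sin(FoV/2) <= n*sin(theta_max) - 1 in one dimension. The grazing-angle cap below is an assumption, not a SCHOTT specification.

```python
import numpy as np

# Standard one-dimensional FoV bound for a single diffractive waveguide:
# the in-air angular window must map into guided angles between the TIR
# limit and a practical grazing cap theta_max (assumed 75 deg here).
theta_max = np.deg2rad(75)
for n in (1.5, 1.7, 1.9, 2.0):
    s = (n * np.sin(theta_max) - 1) / 2
    fov = 2 * np.rad2deg(np.arcsin(min(s, 1.0)))
    print(f"n = {n:.1f}  ->  max horizontal FoV ~ {fov:5.1f} deg")
```

With the assumed 75° grazing cap this gives roughly 26° at n = 1.5 and 56° at n = 2.0, which is the motivation for pushing wafer index upward.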
Evaluation of augmented reality (AR) displays performance based on human visual perception
Conventional display metrics are currently used to evaluate the performance of augmented reality (AR) displays: the maximum luminance the display can reach, image uniformity, contrast ratio, and others. However, these metrics, initially defined for non-transmissive displays, do not consider the impact of the see-through environment present in AR applications. These measurements (usually obtained in a dark laboratory) are used to quantify the performance of AR display devices, but do not necessarily relate to the perceived quality of images rendered by such devices, due to the non-linearity of human perception. Moreover, the projected image (Fig. 1) is an overlay upon the see-through environment. The intensity balance between these two "layers" becomes critical, as the perceived scene (Fig. 2) results from color mixing. Here, we highlight that the brightness (perceived intensity) and color uniformity of the projected image become less important, whereas the contrast ratio between the image and the environment becomes very critical. Other metrics, such as pixel sharpness (MTF) and generated halo effects, also become less significant once the outside scene is added. These metrics remain important for evaluating display performance for comparison purposes, but higher thresholds should be permitted for an AR display. Performance requirements must be adjusted in accordance with the AR application itself and not as is done for conventional displays.
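A minimal worked example of the point about see-through contrast (all luminance values assumed for illustration): the virtual image rides on the ambient scene attenuated by the combiner transmission T, so the perceived contrast is C = (L_display + T*L_ambient) / (T*L_ambient).

```python
# Illustrative see-through contrast of an AR display against ambient
# light. All values below are assumptions, not measurements.
L_display = 500.0                   # cd/m^2 delivered to the eye
T = 0.8                             # see-through transmission
for L_ambient in (10, 100, 1000, 10000):   # dim room ... sunlit street
    C = (L_display + T * L_ambient) / (T * L_ambient)
    print(f"ambient {L_ambient:6d} cd/m^2  ->  contrast {C:7.2f}:1")
```

A display that looks high-contrast in a dark lab (about 63:1 here at 10 cd/m² ambient) collapses to barely above 1:1 in sunlight, which is why image-to-environment contrast dominates the dark-room metrics.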
Digital Optics for 3D Imaging and 3D Display
XSlit cameras for free navigation with depth image-based rendering
Sarah Fachada, Gauthier Lafruit
In free navigation applications, any viewpoint of a three-dimensional scene can be synthesized through Depth Image-Based Rendering (DIBR). In this paper we show that XSlit cameras achieve up to 3 dB PSNR gain over conventional pinhole camera arrays when synthesizing virtual views of the scene with DIBR for small camera displacements. XSlit cameras are a type of general linear camera in which the light rays pass through two non-intersecting slits instead of a single point (the optical center of conventional cameras), resulting in a different epipolar geometry and different projection equations. Instead of synthesizing virtual views with DIBR from a set of conventional pinhole cameras, e.g. a stereo camera pair, a single XSlit camera exploits the distance between its slits and their relative rotation to obtain disparity, from which DIBR virtual views can be synthesized. We first present a theoretical study of the parameters of XSlit cameras, which differ from those of pinhole cameras, while ensuring that the XSlit camera is physically implementable. We then validate the study with DIBR on synthetic content, using perfect depth maps obtained from an in-house modified version of Blender's engine Cycles that simulates XSlit cameras. The virtual view synthesis uses an adapted version of the Reference View Synthesis software used in MPEG, the worldwide standardization committee for media compression. Our experiments show that, for the same overall space covered by the studied camera architectures, XSlit cameras often obtain better DIBR view synthesis results, with up to 3 dB PSNR gain.
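The defining property of an XSlit camera, that every ray meets two fixed skew slits instead of one optical center, admits a compact construction of the projection ray of a scene point: intersect one slit with the plane spanned by the point and the other slit. The sketch below (assumed slit geometry, not the paper's implementation) does exactly that.

```python
import numpy as np

# Minimal XSlit projection sketch (assumed geometry): every camera ray
# meets two fixed skew slits. The ray through a scene point P is the
# line joining P to the point where slit B pierces the plane spanned
# by P and slit A.
def xslit_ray(P, a0, ad, b0, bd):
    """Ray through P meeting slit A (a0 + u*ad) and slit B (b0 + v*bd)."""
    n = np.cross(ad, P - a0)                 # normal of plane(P, slit A)
    v = np.dot(n, a0 - b0) / np.dot(n, bd)   # where slit B meets that plane
    B = b0 + v * bd
    d = P - B
    return B, d / np.linalg.norm(d)          # point on ray, unit direction

# Two orthogonal skew slits: A along x at z = 0, B along y at z = 1.
a0, ad = np.array([0., 0., 0.]), np.array([1., 0., 0.])
b0, bd = np.array([0., 0., 1.]), np.array([0., 1., 0.])
origin, direction = xslit_ray(np.array([0.3, -0.2, 5.0]), a0, ad, b0, bd)
print(origin, direction)
```

If the two slits collapsed to a single point, this construction would reduce to an ordinary pinhole ray; it is the slit separation and relative rotation that, per the abstract, encode the disparity exploited for DIBR.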
EEG based assessment of user performance for a volumetric multiplanar display
The rapid development of three-dimensional visualization technologies requires an accurate assessment of human factors. For this reason, electroencephalography has been broadly employed to investigate user performance and comfort for stereoscopic visualization systems. However, no previous research has investigated cortical activity when viewing real three-dimensional images, such as those demonstrated on volumetric displays. The aim of this pilot study was to investigate the short-term changes in brain activity when viewing images on a volumetric multi-planar display. The visual search array consisted of four circles of constant angular size. In each trial, one of the circles was displayed closer to the subject than the other three. The task was to find the closest circle and submit an answer about its relative location. Each participant performed visual search tasks in three repetitive sessions consisting of 80 trials. We used electroencephalography to record the electrical activity of the brain. No significant changes were found when comparing ERP components between the three repetitive visual task sessions. Regarding frequency band power, significant changes were found only for the alpha band in the central parietal area (Pz) and in the right parietal area (P4). The alpha band power grew considerably in most individuals when comparing the signals of the third session with those of the first. In addition, there was a trend of increased activity in the beta band, although it did not reach statistical significance. In this work, we discuss the possible effects of the visual task and of three-dimensional visualization of information on cortical activity.
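For readers unfamiliar with the band-power measure referred to above, the snippet below estimates alpha-band (8-12 Hz) power for one channel with a Welch periodogram. It is an illustrative computation on synthetic data, not the authors' processing pipeline; the sampling rate and signal are placeholders.

```python
import numpy as np
from scipy.signal import welch

# Illustrative alpha-band (8-12 Hz) power estimate for one EEG channel.
fs = 250                                   # assumed sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
# Synthetic stand-in for an EEG trace: 10 Hz rhythm plus noise.
eeg = 10e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * np.random.randn(t.size)

f, psd = welch(eeg, fs=fs, nperseg=2 * fs)  # PSD from 2 s windows
mask = (f >= 8) & (f <= 12)
alpha = np.sum(psd[mask]) * (f[1] - f[0])   # integrate PSD over the band
print(f"alpha band power: {alpha:.3e} V^2")
```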
11350 Additional Presentations
Wide field of view HOE-based waveguides system for AR display
Augmented reality (AR) systems have been of great interest over the last decade, since they are predicted to be the next generation of consumer mobile devices. One of the key parameters of AR systems is the field of view. The best performance in this regard is shown by DOE/HOE-based planar-waveguide systems, since they provide the widest field of view among the available approaches, even with the simplest architecture. However, it is still not wide enough for consumers, so more complex architectures are being created. In this work, a novel approach for reaching a wide field of view is proposed. It is based on eye-box magnification in two directions by two different waveguide systems. The first system provides magnification along the axis with the wider field of view and consists of waveguides inclined along the central beam of the field of view, with HOE-based 1D gratings providing TIR diffraction in both the +1 and -1 orders. The TIR condition in this case is reached more easily because of the inclination, so a wider angular spectrum can be transferred. The second system provides magnification along the axis with the narrower field of view and consists of a conventional HOE-based periscope system with in-coupling and out-coupling zones. The working principle of the system, HOE specifications, main advantages, challenges, and solutions are discussed. The proposed system allows a 60-degree diagonal field of view for white (RGB) color.
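The ±1-order TIR coupling can be checked with the plane-grating equation n*sin(theta_m) = sin(theta_in) + m*lam/pitch. The sketch below uses assumed values (not the paper's design) to show both first orders landing beyond the critical angle, and how an inclined incidence shifts the pair asymmetrically.

```python
import numpy as np

# Grating-equation check for +1/-1 order TIR coupling into a slab.
# Illustrative values: 532 nm, index 1.7, 420 nm grating pitch.
lam, n, pitch = 532e-9, 1.7, 420e-9
theta_c = np.rad2deg(np.arcsin(1 / n))      # TIR limit, ~36 deg

for theta_in_deg in (0, 10):                # normal vs inclined incidence
    for m in (+1, -1):
        s = (np.sin(np.deg2rad(theta_in_deg)) + m * lam / pitch) / n
        if abs(s) <= 1:
            th = np.rad2deg(np.arcsin(s))
            ok = abs(th) > theta_c
            print(f"in {theta_in_deg:2d} deg, order {m:+d}: {th:6.1f} deg "
                  f"({'TIR' if ok else 'not TIR'})")
        else:
            print(f"in {theta_in_deg:2d} deg, order {m:+d}: evanescent")
```

With these assumed numbers both ±1 orders are guided (about ±48° at normal incidence), so the two halves of the field can be carried symmetrically, in the spirit of the architecture described above.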
Tunable lens for AR headset
The mismatch between the positions of virtual images and the see-through view constitutes a serious problem in virtual and augmented reality optical systems with a single projection plane. It may lead to user discomfort: eye fatigue, headache, and nausea. To solve this problem, a tunable lens forming several projection planes at different locations can be used. The developed varifocal lens consists of two tunable liquid crystal cells. The first cell, for fine adjustment, varies optical power from 1 D to 3 D; the second cell, for coarse adjustment, varies power from 0.25 D to 1 D. The total dioptric range is -4 D to +4 D with an equidistant step of 0.25 D, forming 33 projection planes. The electrode pattern, made of indium zinc oxide, consists of rings corresponding to Fresnel zones; each zone is divided into multiple subzones. To minimize the number of control electrodes (bus lines) while keeping high diffraction efficiency, the bus lines shunt together all of the corresponding subzones in all of the zones. The developed lens is tested with AR glasses based on a holographic waveguide. Displacement of the virtual image from 250 mm to 1 meter is demonstrated.
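Two quick checks on the numbers above: the plane count implied by the stated range and step, and, as an illustration of the ring-electrode layout, the paraxial 2π phase-reset radii of a diffractive Fresnel lens, r_j = sqrt(2*j*lam*f). The design wavelength (550 nm) is an assumption, not a value from the paper.

```python
import numpy as np

# 1) Plane count: -4 D to +4 D in 0.25 D steps -> 8 / 0.25 + 1 = 33.
powers = np.arange(-4, 4 + 0.25, 0.25)
print(len(powers), "projection planes")

# 2) Paraxial 2*pi phase-reset radii of a diffractive Fresnel lens,
#    r_j = sqrt(2 j lam f), at the strongest state (4 D, f = 0.25 m).
lam, f = 550e-9, 1 / 4.0                    # wavelength assumed
r = np.sqrt(2 * np.arange(1, 6) * lam * f)
print("first zone radii (mm):", np.round(r * 1e3, 3))
```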