Augmented reality—technology that enhances the real world by overlaying digital information on the object being viewed—has been much in the spotlight recently owing to rapid advances in display technologies and computer vision science.1 Google, for instance, has developed a head-mounted display called Google Glass that provides an augmented view by superimposing 2D computer-generated images on a transparent medium in front of the eye. This type of digital enhancement could be made even more useful and immersive if 3D imaging could be incorporated into the augmented reality view. However, most existing augmented reality systems can provide only 2D virtual images to the user.
To fill this gap, there has been much recent work aimed at incorporating 3D virtual images in augmented reality systems. For example, studies by Takaki and colleagues and by Hong and colleagues investigated using autostereoscopic 3D display technologies (i.e., ‘glasses-free’ methods that do not require special headgear to create a perception of depth in an image) for augmented reality.2, 3 However, both those proposed solutions call for additional optical systems that make them unwieldy and complicated to fabricate. Since compact systems are preferred for most envisaged practical applications, those complex optical systems pose an obstacle to implementing 3D imaging in augmented reality. To overcome these drawbacks, our group is pursuing a new approach based on developing a novel holographic optical element (HOE) that can be used as a 3D screen in place of the additional optical systems currently required for 3D imaging.4
The HOE is a diffraction grating structure, constructed by hologram recording, that can perform the optical functions of conventional elements such as lenses and mirrors.5 In addition to being see-through while providing relatively high diffraction efficiency, the HOE has the advantages of a thin structure and low cost. This makes the HOE an attractive alternative to conventional optics for implementing an image combiner (a system that merges an overlaid image with the real scene) for augmented reality.6 In our work, we developed an HOE lens array to create a see-through 3D screen for use in augmented reality systems. The HOE lens array performs the same functions as the conventional microlens arrays used in ‘integral imaging,’ one of the techniques used to create autostereoscopic 3D displays. Integral imaging employs a set of 2D images taken from different perspectives (called elemental images) to display an image in a way that looks different depending on viewing angle. In our solution, a set of elemental images is projected on the see-through 3D screen, where the HOE lens array reflects and directs them to provide a 3D view.
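The geometric mapping that integral imaging relies on can be sketched with a simple pinhole-lens model: each lens in the array records (or replays) a given scene point at a slightly shifted position in its own elemental image, and those shifts are what encode depth. The following Python sketch illustrates this for a 1D lens array; all parameters (lens pitch, gap, point position) are illustrative values, not those of our HOE system.

```python
import numpy as np

def elemental_image_points(point, lens_pitch, gap, n_lenses):
    """Project a 3D point through a pinhole-model lens array.

    point      : (x, z) position of the scene point; z is its distance
                 in front of the lens array
    lens_pitch : center-to-center lens spacing (same units as x, z)
    gap        : distance from the lens array to the elemental-image plane
    n_lenses   : number of lenses in the (1D) array

    Returns the x coordinate on the elemental-image plane where each
    lens records the point (pinhole approximation).
    """
    x, z = point
    centers = (np.arange(n_lenses) - (n_lenses - 1) / 2) * lens_pitch
    # Extend the ray from the point through each lens center by 'gap'
    # behind the array: similar triangles give the image-plane offset.
    return centers + (centers - x) * gap / z

# Example (units: mm): a point 50 mm in front of a 5-lens array.
positions = elemental_image_points((2.0, 50.0), lens_pitch=1.0,
                                   gap=3.0, n_lenses=5)
print(np.round(positions, 3))  # → [-2.24 -1.18 -0.12  0.94  2.  ]
```

The per-lens shift grows with the point's lateral offset and shrinks with its depth, which is exactly the disparity pattern an observer's eyes decode as a 3D image.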
The recording scheme is illustrated in Figure 1(a). We used a photopolymer film as the holographic material because it offers the advantages of full-color recording and optically clear characteristics. An array of spherical waves formed by the conventional lens array and a plane-wave reference beam strike the photopolymer from different directions to generate a reflective-type hologram as an interference pattern. For full-color recording, we used combined lasers with the primary colors red (633nm), green (532nm), and blue (473nm) for both the signal and reference beams. By the principles of holography, the HOE lens array reconstructs waves duplicating those of the conventional lens array when a displaying beam with wavelengths and directions identical to those of the reference beam in the recording scheme is projected on it: see Figure 1(b). The difference is that in reconstruction the displaying beam delivers an image, whereas the reference beam in the recording process is simply a plane wave.
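The fringe pattern that the photopolymer stores can be illustrated numerically: superposing a tilted plane reference wave with one spherical signal wave from the lens array gives the intensity distribution recorded in the film. The wavelength, incidence angle, and focal distance below are assumptions chosen for illustration only, not our actual recording geometry.

```python
import numpy as np

# Interference of a plane reference wave and a spherical signal wave
# on the recording plane -- a 1D cross-section sketch.
wavelength = 532e-9          # green recording laser (m)
k = 2 * np.pi / wavelength   # wavenumber
theta = np.deg2rad(30)       # reference-beam incidence angle (assumed)
focal = 3e-3                 # distance to the spherical wave's focus (assumed)

x = np.linspace(-0.5e-3, 0.5e-3, 4096)     # position on the film (m)
plane = np.exp(1j * k * np.sin(theta) * x)            # tilted plane wave
spherical = np.exp(1j * k * np.sqrt(x**2 + focal**2))  # point-source wave
pattern = np.abs(plane + spherical) ** 2   # recorded fringe intensity

print(pattern.min(), pattern.max())  # fringes swing between ~0 and ~4
```

The stored fringes act as a grating: when the displaying beam later reproduces the reference wave's phase, diffraction from this pattern regenerates the spherical signal wave, i.e., the lens function.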
Figure 1. Schematic diagrams illustrating the principles of the holographic optical element (HOE) lens array: (a) recording and (b) reconstruction procedures.
Since the recording scheme employs a collimated reference beam with a specific incidence angle, the light from the imaging device that illuminates the HOE lens array for reconstruction must be collimated at that same incidence angle. When elemental images that satisfy the collimation and incidence-angle constraints are projected on the HOE lens array, it acts as a 3D screen by the principles of projection-type integral imaging.7 The HOE lens array is also see-through because it functions as an image combiner only for Bragg-matched light (i.e., light whose wavelength and direction are identical to those of the reference beam used in hologram recording). The 3D imaging and see-through properties of our solution are both illustrated in Figure 2.
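The see-through behavior can be thought of as a Bragg-match filter: only light close to the recording reference beam in both wavelength and direction is diffracted into the virtual image, while ambient light at other wavelengths and angles passes straight through. The toy Python check below illustrates the idea; the tolerance values are stand-ins for the hologram's real wavelength and angular selectivity, which depend on film thickness and index modulation.

```python
def bragg_matched(wavelength, angle_deg, ref_wavelength, ref_angle_deg,
                  dl_tol=5e-9, dth_tol=1.0):
    """Crude Bragg-match test: the HOE diffracts appreciably only when
    the incident beam's wavelength and angle are close to those of the
    recording reference beam. Tolerances are illustrative stand-ins."""
    return (abs(wavelength - ref_wavelength) < dl_tol and
            abs(angle_deg - ref_angle_deg) < dth_tol)

# A 532 nm projector beam at the recording angle is diffracted (image),
# while light at a different wavelength and angle mostly passes through.
print(bragg_matched(532e-9, 30.0, 532e-9, 30.0))   # True  -> acts as combiner
print(bragg_matched(550e-9, 10.0, 532e-9, 30.0))   # False -> see-through
```

In practice the transition is gradual rather than binary, but the selectivity is sharp enough that the HOE combines the projected 3D image with the real scene without noticeably dimming it.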
Figure 2. Views of the 3D screen from different perspectives display the proper disparities according to the observing direction. In this experimental example the 3D virtual images are the letters (S, N, and U) and the real-world background object is the textured cube.
In summary, we developed a novel see-through screen based on an HOE lens array for use in 3D augmented reality. We are now further extending this approach to customize the viewing characteristics of the 3D image on the see-through screen, and developing an algorithm for achieving appropriate color representation in this proposed method.
This research was supported by the Korean Ministry of Science, ICT and Future Planning as part of the Giga-KOREA Project (GK13D0200: Development of Super Multi-View (SMV) Display Providing Real-Time Interaction).
Byoungho Lee, Keehoon Hong
School of Electrical Engineering
Seoul National University (SNU)
Seoul, Republic of Korea
Byoungho Lee received a PhD from the Department of Electrical Engineering and Computer Science, University of California, Berkeley (1993). Since 1994, he has been a faculty member at SNU. He is a Fellow of SPIE, IEEE, and the Optical Society.
Keehoon Hong received a BSc from the School of Electrical and Electronic Engineering at Yonsei University in Seoul, Korea (2008). He is currently working toward a PhD at SNU.
1. B. Lee, Three-dimensional displays, past and present, Phys. Today 66(4), p. 36-41, 2013. doi:10.1063/PT.3.1947
2. Y. Takaki, Y. Urano, S. Kashiwada, H. Ando, K. Nakamura, Super multi-view windshield display for long-distance image information presentation, Opt. Express 19(2), p. 704-716, 2011. doi:10.1364/OE.19.000704
3. J. Hong, S.-W. Min, B. Lee, Integral floating display systems for augmented reality, Appl. Opt. 51(18), p. 4201-4209, 2012. doi:10.1364/AO.51.004201
4. K. Hong, J. Yeom, C. Jang, J. Hong, B. Lee, Full-color lens-array holographic optical element for three-dimensional optical see-through augmented reality, Opt. Lett. 39(1), p. 127-130, 2014. doi:10.1364/OL.39.000127
5. H. J. Coufal, G. T. Sincerbox, D. Psaltis, Holographic Data Storage, Springer, New York, 2000.
6. H. Mukawa, K. Akutsu, I. Matsumura, S. Nakano, A full color eyewear display using holographic planar waveguides, SID Int'l Symp. Dig. Tech. Papers 39(1), p. 89-92, 2008. doi:10.1889/1.3069819
7. J. Hong, Y. Kim, S.-g. Park, J.-H. Hong, S.-W. Min, S.-D. Lee, B. Lee, 3D/2D convertible projection-type integral imaging using concave half mirror array, Opt. Express 18(20), p. 20628-20637, 2010. doi:10.1364/OE.18.020628