
Proceedings Paper

Eyeball camera based calibration and performance verification for spatial computing systems

Paper Abstract

Spatial computing overlays the digital world onto the real world in a spatially interactive manner by merging digital light fields, perception systems, and computing. The digital content presented by a spatial computing system must work in tandem with the real-world surroundings and, more importantly, with the human eye-brain system, which is the ultimate judge of system success. Consequently, developing a spatial computing system requires a proxy for the human eye-brain system with which to calibrate and verify performance. This paper proposes a novel camera design for this purpose that mimics human ocular anatomy and physiology in three aspects: geometry, optical performance, and oculomotor control. Specifically, the proposed camera not only adopts the corneal and pupil geometry of the human eye, but its iris and pupil can also be configured with multiple texture, color, and diameter options. Furthermore, the resolution of the eyeball camera is designed to match the acuity of typical 20/20 human vision, and its focus can be dynamically adjusted from 0 to 3 diopters. Lastly, a pair of eyeball cameras is mounted independently on two hexapods to simulate eye gaze and vergence. With the eyeball cameras, both the perceived virtual and real worlds can be calibrated and evaluated under deterministic, quantifiable eye conditions such as pupil location and gaze. In principle, the proposed eyeball camera serves as a bridge that combines data from across the spatial computing system (eye tracking, 3D geometry of the digital world, display color accuracy and uniformity, and display optical quality such as sharpness and contrast) into a holistic view, helping to blend the virtual and real worlds together seamlessly.
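To make the abstract's numbers concrete, the sketch below converts the stated 0-3 diopter accommodation range into focus distances and estimates the corresponding vergence angle a hexapod pair would need to produce. The inter-pupillary distance value and both helper functions are illustrative assumptions, not details from the paper; the relations used are only the standard thin-lens (distance = 1/diopters) and symmetric-fixation geometry.

```python
import math

# Assumed inter-pupillary distance (IPD) in meters; a typical adult
# value, NOT a parameter given in the paper.
IPD_M = 0.063

def focus_distance_m(diopters: float) -> float:
    """Convert accommodation in diopters to focus distance in meters.

    0 diopters corresponds to focus at optical infinity.
    """
    if diopters <= 0.0:
        return math.inf
    return 1.0 / diopters

def vergence_angle_deg(distance_m: float, ipd_m: float = IPD_M) -> float:
    """Total vergence angle (degrees) for two eyes symmetrically
    fixating a point straight ahead at distance_m."""
    if math.isinf(distance_m):
        return 0.0  # parallel gaze at infinity
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / distance_m))

# The paper's accommodation range of 0 to 3 diopters spans focus from
# infinity down to about 0.33 m; the near end of that range implies a
# vergence demand of roughly 11 degrees for this assumed IPD.
near = focus_distance_m(3.0)
print(near, vergence_angle_deg(near))
```

This kind of table (diopters in, distance and vergence out) is one way the two hexapod-mounted eyeball cameras could be commanded to consistent gaze/vergence states during verification.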

Paper Details

Date Published: 19 February 2020
PDF: 6 pages
Proc. SPIE 11310, Optical Architectures for Displays and Sensing in Augmented, Virtual, and Mixed Reality (AR, VR, MR), 113101H (19 February 2020); doi: 10.1117/12.2553259
Author Affiliations:
Zhiheng Jia, Magic Leap, Inc. (United States)
Hyunsun Chung, Magic Leap, Inc. (United States)
Jeffrey Daiker, Magic Leap, Inc. (United States)
Sina Sedighi, Magic Leap, Inc. (United States)
Nicholas Morley, Magic Leap, Inc. (United States)
Daniel Dominguez, Magic Leap, Inc. (United States)
Jeremy Grata, Magic Leap, Inc. (United States)
Hudson Welch, Magic Leap, Inc. (United States)


Published in SPIE Proceedings Vol. 11310:
Optical Architectures for Displays and Sensing in Augmented, Virtual, and Mixed Reality (AR, VR, MR)
Bernard C. Kress; Christophe Peroz, Editor(s)

© SPIE.