
Proceedings Paper

Composite pattern structured light projection for human computer interaction in space

Paper Abstract

Interacting with computer technology while wearing a space suit is difficult at best. We present a sensor that can interpret body gestures in three dimensions. The depth dimension allows simple thresholding to isolate the hands and to use their position and orientation as input controls for digital devices such as computers and robotic systems. Structured light pattern projection is a well-known method for accurately extracting three-dimensional information from a scene. Traditional structured light methods require several different patterns to recover depth without ambiguity or albedo sensitivity, and they are corrupted by object motion during the projection/capture process. The authors have developed a methodology for combining multiple patterns into a single composite pattern using two-dimensional spatial modulation techniques. A single composite pattern projection does not require synchronization with the camera, so the data acquisition rate is limited only by the video rate. We have incorporated dynamic programming to greatly improve the resolution of the scan. Other applications include machine vision, remote-controlled robotic interfacing in space, advanced cockpit controls, and computer interfacing for the disabled. We present performance analysis, experimental results, and video examples.
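The abstract's core idea, modulating several phase-shifted fringe patterns onto distinct spatial carrier frequencies and summing them into one projectable image, can be illustrated with a minimal sketch. All specifics here (four patterns, the fringe frequency, the carrier frequencies, the image size) are illustrative assumptions, not values taken from the paper:

```python
import numpy as np

def composite_pattern(height=480, width=640, n_patterns=4,
                      fringe_freq=8, carrier_freqs=(60, 80, 100, 120)):
    """Sketch of a composite structured-light pattern.

    Each of n_patterns phase-shifted sinusoidal fringe patterns (varying
    along y) is amplitude-modulated onto its own spatial carrier frequency
    along the orthogonal x axis, then all are summed into one image.
    Parameter values are illustrative assumptions only.
    """
    y = np.arange(height)[:, None] / height   # vertical coordinate in [0, 1)
    x = np.arange(width)[None, :] / width     # horizontal coordinate in [0, 1)
    composite = np.zeros((height, width))
    for k, fc in enumerate(carrier_freqs[:n_patterns]):
        phase = 2 * np.pi * k / n_patterns
        # k-th phase-shifted fringe pattern, scaled to [0, 1]
        fringe = 0.5 * (1 + np.cos(2 * np.pi * fringe_freq * y + phase))
        # carrier along the orthogonal axis separates patterns in frequency
        carrier = 0.5 * (1 + np.cos(2 * np.pi * fc * x))
        composite += fringe * carrier
    return composite / composite.max()        # normalize to [0, 1] for projection
```

On the camera side, each component pattern would be recovered by band-pass filtering around its carrier frequency before the usual phase-based depth computation; that demodulation step is omitted here.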

Paper Details

Date Published: 19 May 2005
PDF: 11 pages
Proc. SPIE 5798, Spaceborne Sensors II, (19 May 2005); doi: 10.1117/12.603808
Author Affiliations:
Chun Guan, Univ. of Kentucky (United States)
Laurence G. Hassebrook, Univ. of Kentucky (United States)
Daniel L. Lau, Univ. of Kentucky (United States)
Veera Ganesh Yalla, Univ. of Kentucky (United States)


Published in SPIE Proceedings Vol. 5798:
Spaceborne Sensors II
Peter Tchoryk; Brian Holz, Editors

© SPIE.