Touch screens go optical

A simple optical implementation of a touch screen is made possible by disrupting the total internal reflection in a 2D waveguide.
11 December 2012
Steen Grüner Hanson, Michael Linde Jakobsen and Henrik C. Pedersen

The use of touch screens has expanded in the last decade, mainly due to the introduction of smartphones and tablet computers, which reached annual sales of more than 400 million units in 2009.1 This number is anticipated to increase steadily over the next decade. Of the dozen or so touch-screen technologies available, two have captured the market: capacitive and resistive. The capacitive touch screen, which works by changing the local capacitance of metallic layers in the screen, reacts to very light touches and is durable, but it is expensive to manufacture. Resistive touch screens, in which conducting layers are separated by a flexible material that is compressed locally on touch, are cheaper but delicate, and they are difficult to fabricate in larger sizes.

In our work, we investigated the potential of optics in this field. Optical touch screens have been on the market for several decades,2 primarily those in which a light curtain, formed by a series of emitters and detectors, hovers above the screen's surface. Other groups have devised systems based on disrupting total internal reflection in a waveguide,3–5 but their methods rely on arrays of emitters and receivers placed along the sides of the waveguide, which drives up the cost of these technologies and limits their adoption.

We devised a touch screen in which light is confined in a waveguide that reacts to touch (see Figure 1). If light is injected into this waveguide at an angle of incidence above the critical angle, which is determined by the refractive indices of the guide and the surrounding medium, the light is trapped inside by total internal reflection. A change in the refractive index at the surface, caused for example by a finger touching the waveguide, results in out-coupling of the otherwise trapped light. The same does not happen if, for example, water comes into contact with the waveguide's surface, because water has a lower refractive index (1.33) than a finger (1.47).
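
To make the out-coupling condition concrete, the following minimal Python sketch computes the critical angle of the guide against different contact media. The numbers are illustrative assumptions (an acrylic guide index of about 1.49, with the water and finger values quoted above), not the authors' design code:

```python
from math import asin, degrees

def critical_angle_deg(n_guide: float, n_medium: float) -> float:
    """Critical angle, measured from the surface normal, for total
    internal reflection at a guide/medium interface (Snell's law)."""
    return degrees(asin(n_medium / n_guide))

# Assumed indices: injection-molded acrylic ~1.49; water and skin
# values as quoted in the text.
n_guide = 1.49
for name, n in [("air", 1.00), ("water", 1.33), ("finger", 1.47)]:
    print(f"{name:6s}: critical angle = {critical_angle_deg(n_guide, n):.1f} deg")

# Output: air 42.2 deg, water 63.2 deg, finger 80.6 deg. A ray guided
# at, say, 75 deg from the normal stays trapped against air and water
# but not against a finger, so only the finger couples light out.
```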


Figure 1. A waveguide-based touch screen. A video of the working principle is available online.6

The call for a simple and affordable replacement for present technologies implies that we use only one light source and one detector array, preferably placed adjacent to each other. Figure 2 shows our basic system, implemented as a 2D waveguide.


Figure 2. Top view of the touch module comprising a polymer waveguide with specially shaped edges, a light source V, and a linear detector array DA. Cx and Cy are Fresnel-shaped collimating edges, Fx and Fy are Fresnel-shaped focusing edges, and T is a touch point. The two x- and y-parts of the detector array are dedicated to the x- and y-interrogating rays, respectively.

Light from a vertical-cavity surface-emitting laser (VCSEL) is coupled in at the lower right corner of the waveguide (see Figure 2). The light is spread across the entire waveguide by an inherent negative lens. It propagates in the plane of the waveguide and is therefore not disturbed by an object placed on the surface. The light hits the opposite edges of the waveguide, the Fresnel-shaped collimating edges Cx and Cy, and one half is guided in the negative x-direction while the other half is guided in the negative y-direction. Not only do the facets guide the rays along the main directions; they are also slightly tilted, making the rays bounce up and down between the surfaces as they cross the waveguide. A finger (or an object of similar refractive index) placed at point T is therefore able to couple out the light.

Next, the rays moving in the negative x- and y-directions hit the right and lower facets, Fx and Fy, respectively, and are focused toward the upper-left corner of the guide. Again, these facets are tilted to make the bouncing rays settle into the plane of the waveguide and thus avoid sensitivity to any surface disturbance. Placing a linear photosensitive detector array (a CMOS array) at a distance from the focal position maps each angle of incidence uniquely onto a detector element in the array. At the outset, the linear 'image' on the detector array is recorded; any subsequent change then uniquely identifies the x- and y-position of the disturbance, here a finger.
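
The detection step lends itself to a simple baseline-subtraction scheme. The sketch below is a hypothetical Python illustration (the function name, threshold, and array size are our own assumptions, not the published code): it records an untouched reference signal and locates the dip that an out-coupling touch leaves in a later frame. In the actual module this would run separately on the x- and y-halves of the detector array.

```python
import numpy as np

def find_touch(baseline: np.ndarray, frame: np.ndarray,
               threshold: float = 0.1) -> int | None:
    """Return the detector element darkened by an out-coupling touch,
    or None if the change is too small to be a touch."""
    diff = baseline - frame            # out-coupled light appears as a dip
    peak = int(np.argmax(diff))        # deepest dip in the current frame
    if diff[peak] < threshold * float(baseline.max()):
        return None
    return peak

# Toy usage: a 256-element line scan with a simulated touch near element 90.
rng = np.random.default_rng(0)
baseline = 1.0 + 0.01 * rng.standard_normal(256)
frame = baseline.copy()
frame[85:95] -= 0.4                    # finger couples ~40% of the light out
print(find_touch(baseline, frame))     # -> an index in 85..94
```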

In our work, we built a system using a mold with delicately tilted Fresnel reflectors.7 We injection-molded the waveguide in acrylic. A VCSEL with a wavelength of 670nm was used as the light source, and a CMOS camera functioned as the detector array. Figure 3 shows the implementation and its response for a 5×5 keypad of size 40×40mm.
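
For a keypad demonstrator like this, the recovered (x, y) coordinate only needs to be quantized to a key. One possible mapping, assuming the 40×40mm area and 5×5 grid above (the function and its edge clamping are our illustration, not the published implementation):

```python
def keypad_cell(x_mm: float, y_mm: float,
                size_mm: float = 40.0, keys: int = 5) -> tuple[int, int]:
    """Quantize a touch coordinate on a size_mm x size_mm active area
    into the (column, row) of a keys x keys keypad."""
    pitch = size_mm / keys                   # 8 mm per key for 40 mm / 5
    col = min(int(x_mm // pitch), keys - 1)  # clamp touches on the far edge
    row = min(int(y_mm // pitch), keys - 1)
    return col, row

print(keypad_cell(13.0, 31.5))  # -> (1, 3)
```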


Figure 3. (left) Test of the fabricated touch module: laser light is launched at the upper left corner and a camera is placed at the opposite corner. (center) The green curve is the untouched camera signal, the red curve is the touched camera signal, and the black curve is the difference between the two. (right) Result of the peak-finding algorithm that determines the touch location. A video showing the response of the optical waveguide to touch is available online.8

In the case of two touches, the present setup generates two x- and two y-values, which in most cases cannot be unambiguously paired. Such a setup is usually called a 'one-and-a-half-finger system,' which suffices for pinching, scrolling, and rotation. Our investigations are currently aimed at placing the electronics in the same corner and at realizing true multi-finger systems without sacrificing the simplicity and robustness of the basic concept.
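
The pairing ambiguity is easy to see in code. In this toy example (the coordinates are invented), the same four readings support two candidate interpretations, while the x- and y-spreads, which are all a pinch or zoom gesture needs, are the same for both:

```python
# Two simultaneous touches give two x-readings and two y-readings,
# but the readings alone do not say which x belongs with which y.
xs = [12.0, 30.0]  # measured x-positions in mm (invented values)
ys = [8.0, 25.0]   # measured y-positions in mm (invented values)

pairing_a = [(xs[0], ys[0]), (xs[1], ys[1])]  # one consistent interpretation
pairing_b = [(xs[0], ys[1]), (xs[1], ys[0])]  # the equally consistent 'ghost' pair
print(pairing_a, pairing_b)

# The spreads are pairing-independent, which is why pinch, scroll,
# and rotate gestures still work despite the ambiguity.
print(abs(xs[0] - xs[1]), abs(ys[0] - ys[1]))  # -> 18.0 17.0
```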

The authors thank Opdi Technologies A/S for supporting this development.


Steen Grüner Hanson, Michael Linde Jakobsen, Henrik C. Pedersen
Department of Photonics Engineering
Technical University of Denmark
Roskilde, Denmark

Steen Grüner Hanson is a professor. Michael Linde Jakobsen and Henrik C. Pedersen are senior scientists.


References:
1. M. J. Miller, Fragmented touch-screen tech drives forward, PC Magazine, 3 March 2010.
2. I. Maxwell, An overview of optical-touch technologies, Inf. Disp. 12, p. 26, 2007.
3. R. G. Johnson, D. Fryberger, Touch actuable data input panel assembly, US patent 3673327, 1972.
4. H. Ma, J. A. Paradiso, Object's location detection apparatus in touch panel, has sensors provided at panel periphery, to detect change in light intensity at detection points on panel surface due to presence of object, US patent 2004252091-A1, 2004.
5. J. Moeller, A. Kerne, Scanning FTIR: unobtrusive optoelectronic multi-touch sensing through waveguide transmissivity imaging, Proc. 4th Int'l Conf. Tangible, Embedded, and Embodied Interaction, pp. 73–76, 2010.
6. http://spie.org/documents/newsroom/videos/4584/Touchpad_1.avi Video showing how light is coupled out of a waveguide by the touch of a finger. Accessed 4 December 2012.
7. H. C. Pedersen, M. L. Jakobsen, S. G. Hanson, M. Mosgaard, T. Iversen, J. Korsgaard, Optical touch screen based on waveguide sensing, Appl. Phys. Lett. 99, p. 061102, 2011.
8. http://spie.org/documents/newsroom/videos/4584/Touchpad_2.wmv Video showing the response of the optical waveguide to touch. The keypad is shown to the right and the response at the lower left. The CMOS array signal is presented in the upper right chart, and the change in reading is shown in the panel below. Accessed 4 December 2012.