Tangible imaging systems merge the real and virtual worlds

Combining the power of computer graphics with the interactive capabilities of mobile devices creates digital representations of 3D objects that can be manipulated as naturally as real ones.
29 January 2013
James Ferwerda

When observers interact with an object, they typically engage in a complex set of behaviors that includes active manipulation and dynamic viewing. These behaviors enable observers to see how the object looks from different perspectives and under different lighting conditions, providing rich visual information about the object's shape and material properties. Modern computer graphics methods can render realistic images of complex objects at interactive rates, enabling great advances in fields such as digital cinema, computer gaming, medical imaging, and cultural heritage archiving. However, in standard graphics systems the observer is always one step removed from the object: images of the object are typically viewed on a screen, and interaction is usually mediated through interface devices such as a keyboard or mouse.

We are developing tangible imaging systems [1–4] that enable natural interaction with virtual objects. Tangible imaging systems are based on modern mobile devices—such as laptop computers and tablets—that typically include touch screens, digital cameras, accelerometers, gyroscopes, and graphics hardware, in addition to traditional displays and input devices. We have developed custom software that takes advantage of these components to allow the orientation of a device and the position of the observer to be tracked in real time. Using this information, realistic images of 3D objects with complex shapes and material properties are rendered to the screen, and tilting or moving in front of the device produces realistic changes in surface appearance. Tangible imaging systems thus enable virtual objects to be observed and manipulated as naturally as real ones with the added benefit that object properties can be modified under the control of the user.
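As a rough sketch of how sensor data can drive re-rendering, the Swift fragment below uses iOS's CoreMotion framework to watch device attitude and trigger a redraw on each update. The render callback and the normal extraction are illustrative assumptions, not our published implementation, and camera-based observer tracking is omitted:

```swift
import CoreMotion
import simd

// Minimal sketch: watch device attitude at 60 Hz and trigger a redraw when
// it changes. CMMotionManager is the standard iOS motion API; the render
// callback and normal extraction are illustrative, not the published system.
final class OrientationTracker {
    private let motion = CMMotionManager()

    func start(onUpdate render: @escaping (simd_float3) -> Void) {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 60.0
        motion.startDeviceMotionUpdates(to: .main) { data, _ in
            guard let r = data?.attitude.rotationMatrix else { return }
            // One column of the attitude matrix gives the screen's normal in
            // world coordinates (sign conventions vary by reference frame).
            let normal = simd_float3(Float(r.m13), Float(r.m23), Float(r.m33))
            render(normal)  // re-shade the virtual surface for the new tilt
        }
    }
}
```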

To date, we have developed four systems: tangiBook, our first implementation of the concept on a laptop computer; tangiView, a refined implementation on a tablet device; tangiPaint, a digital painting system that enables users to directly create paintings with rich surface textures and material properties; and phantoView, an implementation that takes the tangible imaging concept into stereoscopic 3D.


Figure 1. The tangiBook system—tilting the display (top row) or moving in front of it (bottom row) produces realistic changes in the appearance of the rendered surface.

The capabilities of the tangiBook system are illustrated in Figure 1, which shows a 3D model of an oil painting being rendered to the laptop's display in real time. The images in the top row show how tilting the screen produces realistic changes in surface shading and highlights. The images in the bottom row show the corresponding changes that occur when the observer moves relative to the screen/painting. Taken together, the realistic rendering and direct interaction capabilities of the system provide an experience that is similar to interacting with a real painting.
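Both rows of Figure 1 follow from standard local shading: tilting the device changes the surface normal relative to the light, while head movement changes the view direction, and both shift the diffuse shading and specular highlights. A minimal Blinn-Phong sketch of this dependence (the parameters are illustrative; the actual system renders measured material properties):

```swift
import Foundation
import simd

// Blinn-Phong shading: tilting the device changes the surface normal n,
// while head movement changes the view direction v. Both alter the specular
// term, which is why highlights slide across the painting.
func shade(normal n: simd_float3, toLight l: simd_float3, toViewer v: simd_float3,
           diffuse kd: Float, specular ks: Float, shininess: Float) -> Float {
    let nn = simd_normalize(n)
    let ln = simd_normalize(l)
    let h = simd_normalize(ln + simd_normalize(v))      // half-angle vector
    let diff = max(simd_dot(nn, ln), 0)                 // Lambertian term
    let spec = powf(max(simd_dot(nn, h), 0), shininess) // specular lobe
    return kd * diff + ks * spec
}
```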


Figure 2. tangiView—an implementation of the tangible imaging concept on a tablet device. The natural form-factor and touch screen facilitate direct interaction.

Although the capabilities of the tangiBook are novel and compelling, tablets provide a much better form-factor for developing tangible imaging systems because they are lighter and are shaped much more like the nearly flat surfaces they are being used to simulate. tangiView is our system for the Apple iPad 2 tablet (see Figure 2). All the rendering and display capabilities are the same as those of the tangiBook system, but tangiView also incorporates a touch-driven interface and improves versatility by providing an image-based format for 3D surface models that can be shared using standard photo applications.
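One plausible way to realize such an image-based format is to carry albedo in an ordinary image and surface relief in a companion height map, deriving shading normals on load. The layout below is an assumption for illustration, not tangiView's published format:

```swift
import simd

// Sketch of an image-based surface model: an ordinary RGB image carries the
// albedo, and a companion grayscale image carries a height field. The
// channel layout is an illustrative assumption, not tangiView's format.
struct SurfaceModel {
    let width: Int
    let rows: Int
    let albedo: [simd_float3]  // per-pixel color, row-major
    let relief: [Float]        // per-pixel height, same dimensions

    // Derive a shading normal from the height field by central differences,
    // clamping lookups at the image borders.
    func normal(x: Int, y: Int) -> simd_float3 {
        func h(_ px: Int, _ py: Int) -> Float {
            relief[min(max(py, 0), rows - 1) * width + min(max(px, 0), width - 1)]
        }
        let dx = h(x + 1, y) - h(x - 1, y)
        let dy = h(x, y + 1) - h(x, y - 1)
        return simd_normalize(simd_float3(-dx, -dy, 2))
    }
}
```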


Figure 3. tangiPaint—with a finger on the tablet, users can create digital ‘oil paintings’ that have rich 3D surface textures and material properties. Tilting the tablet changes the lighting on the painting to reveal these features.

Given the touch-based interaction capabilities of tablet devices, we have focused our efforts on the exciting possibility of creating tangible content rather than just viewing existing models. The resulting system is tangiPaint (see Figure 3). tangiPaint lets users create virtual ‘oil paintings’ with their fingers on the tablet's touch screen. Users can select different textured ‘canvases’ to paint on and can work with ‘paints’ and ‘brushes’ that produce realistic-looking paintings with complex variations in surface color, gloss, and texture.
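To give a flavor of how such strokes might work, the sketch below stamps each touch sample into color, gloss, and relief maps so that strokes build up visible thickness. The map layout and blending rules are assumptions for illustration, not tangiPaint's actual brush engine:

```swift
import simd

// Sketch of a paint deposit step: each touch sample stamps the brush into
// color, gloss, and height maps, so strokes accumulate visible impasto.
struct Canvas {
    var color: [simd_float3]
    var gloss: [Float]
    var relief: [Float]
    let width: Int, rows: Int

    mutating func stamp(at cx: Int, _ cy: Int, radius r: Int,
                        paint: simd_float3, paintGloss: Float, load: Float) {
        let y0 = max(cy - r, 0), y1 = min(cy + r, rows - 1)
        let x0 = max(cx - r, 0), x1 = min(cx + r, width - 1)
        guard y0 <= y1, x0 <= x1 else { return }  // stamp misses the canvas
        for y in y0...y1 {
            for x in x0...x1 {
                let d = simd_length(simd_float2(Float(x - cx), Float(y - cy)))
                guard d <= Float(r) else { continue }
                let a = load * (1 - d / Float(r))   // soft falloff at the rim
                let i = y * width + x
                color[i] = simd_mix(color[i], paint, simd_float3(repeating: a))
                gloss[i] = gloss[i] * (1 - a) + paintGloss * a
                relief[i] += 0.05 * a               // paint thickness builds up
            }
        }
    }
}
```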

These tangible imaging systems have been well received, but a question that frequently arises is whether they can be used to interact with fully three-dimensional objects rather than nearly flat surfaces. To explore this question, we developed a system called phantoView, which supports direct interaction with digital models of 3D objects. The unique feature of phantoView is its use of a special kind of stereoscopic rendering known as a phantogram (a 2D image that appears to be a solid 3D object) to create compelling stereoscopic visualizations of the objects.


Figure 4. The phantoView system—3D object models loaded into the system are rendered in real time as phantograms to the tablet's display. A rendered object appears to be sitting on the tablet's surface. Rotating the tablet about the vertical axis allows the object to be seen from other viewpoints.

Figure 4 shows the phantoView system in action (the views of the object are simulated here because of the difficulty of photographing stereoscopic images). To begin, an object model is loaded from a standard Wavefront object (OBJ) file. A phantogram of the object is then rendered to the screen in real time. To see the object from different viewpoints, the user simply rotates the tablet around a vertical axis, an experience that simulates manipulating an object sitting on the tablet. A pinch gesture scales the object, a two-finger swipe changes the azimuth and elevation of the scene lighting, and a single-finger swipe switches between loaded object models.
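This gesture-to-parameter mapping can be sketched with standard UIKit gesture recognizers. The controller below is a hypothetical skeleton, assuming illustrative state variables and update rules, not the shipped phantoView code:

```swift
import UIKit

// Sketch of the gesture mapping described above, using standard UIKit
// recognizers. The state variables and scale factors are stand-ins for the
// real renderer's parameters.
final class PhantoViewController: UIViewController {
    var objectScale: CGFloat = 1.0
    var lightAzimuth: CGFloat = 0, lightElevation: CGFloat = 45

    override func viewDidLoad() {
        super.viewDidLoad()
        let pinch = UIPinchGestureRecognizer(target: self, action: #selector(pinched))
        let pan = UIPanGestureRecognizer(target: self, action: #selector(panned))
        pan.minimumNumberOfTouches = 2      // two-finger swipe moves the light
        let swipe = UISwipeGestureRecognizer(target: self, action: #selector(swiped))
        [pinch, pan, swipe].forEach(view.addGestureRecognizer)
    }

    @objc func pinched(_ g: UIPinchGestureRecognizer) {
        objectScale *= g.scale              // pinch scales the object
        g.scale = 1
    }

    @objc func panned(_ g: UIPanGestureRecognizer) {
        let t = g.translation(in: view)
        lightAzimuth += t.x * 0.5           // horizontal motion: azimuth
        lightElevation += t.y * 0.5         // vertical motion: elevation
        g.setTranslation(.zero, in: view)
    }

    @objc func swiped(_ g: UISwipeGestureRecognizer) {
        // Single-finger swipe: advance to the next loaded OBJ model
        // (model-switching helper omitted from this sketch).
    }
}
```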

In summary, tangible imaging systems merge the power of computer graphics with the direct interaction capabilities of modern mobile devices to create rich digital representations of 3D objects that can be viewed and manipulated as naturally as real ones. Although the systems described here represent promising first efforts, we are still developing the technology: creating better rendering, modeling, and interaction methods, and running psychophysical experiments to evaluate how well tangible images convey appearance properties. In addition, we are developing applications of tangible imaging systems in computer-aided appearance design and communication, and in enhanced access to digital library and museum collections. At the time of writing, versions of some of the applications described are available at www.tangibleimagingsystems.com.

This work was conducted in collaboration with Reynold J. Bailey, Anthony M. Blatner, Benjamin A. Darling, and Okka Kyaw. Some of the work was supported by National Science Foundation grants CCF-0811032 and IIS-0113310 to James Ferwerda.


James Ferwerda
Rochester Institute of Technology
Rochester, New York

Dr. Ferwerda is an associate professor and Xerox chair in the Chester F. Carlson Center for Imaging Science.


References:
1. B. A. Darling, J. A. Ferwerda, The tangiBook: a tangible display system for direct interaction with virtual surfaces, Proc. IS&T 17th Color Imaging Conf., pp. 260-266, 2009.
2. B. A. Darling, J. A. Ferwerda, Tangible display systems: direct interfaces for computer-based studies of surface appearance, Proc. SPIE Human Vision and Electron. Imaging, 72570Q, pp. 1-12, 2010. doi:10.1117/12.845182
3. A. M. Blatner, J. A. Ferwerda, B. A. Darling, R. J. Bailey, tangiPaint: a tangible digital painting system, Proc. IS&T 19th Color Imaging Conf., pp. 102-107, 2011.
4. J. A. Ferwerda, Tangible imaging systems, Proc. SPIE Electron. Imaging (Imaging and Printing in a Web 2.0 World IV), 2013.