Sensing & Measurement

Lab studies human, machine, and computer touch

From OE Reports Number 186 - June 1999
31 June 1999, SPIE Newsroom. DOI: 10.1117/2.6199906.0003

At MIT's Laboratory for Human and Machine Haptics (Cambridge, MA), researchers are investigating the human sense of touch and developing equipment both for studying it and for building touch-based computer interfaces.

Touch Lab Director Mandayam Srinivasan said, "By haptics we mean almost anything to do with manual exploration and manipulation."

While the sense of touch is not restricted to hands, the only part of the body that can rival the concentration and sensitivity of touch receptors in the fingertips is the tongue. Touch can also sense a variety of properties, such as shape, softness, texture, friction, temperature, weight, and movement.

The lab's work has implications for computer interfaces to virtual reality, which can be used for training, tele-operation, and entertainment applications, as well as longer-term application to designing better robot hands.

Srinivasan uses analogies to vision-based systems to explain the different areas of work: "human haptics," which examines how the hand-brain system works, is much like research into how eyes and the brain work together; "machine haptics," which is concerned with building machines that can act like hands, is analogous to designing machine vision systems; and "computer haptics," which involves developing algorithms and software for rendering tactual displays, is akin to work in computer graphics.

Even among groups studying human haptics, the Touch Lab is rare in that it ties together research into the biomechanics of touch, the neurophysiology (what signals are transmitted from receptors to the brain), psychophysics (how fine a signal can be perceived), and motor control (how well the hand can be controlled). In addition, the researchers work on the development of machine and computer haptics.


Figure: Human-machine haptic interaction. The human sense of touch involves a closed loop system of receptors sensing, transmitting messages to and from the brain, thinking, and manipulating. Haptic interfaces require a similar system, but one that is electromechanical and computer-based.
Virtual touch

Computer haptics software concerned with generating and rendering touchable virtual objects is analogous to computer graphics. A commercial haptic interface, the PHANToM from Sensable Technologies (Cambridge, MA), can be thought of as analogous to a computer display screen. The user either grasps a stylus or puts a finger in a thimble and uses it to touch, feel, and manipulate virtual objects within the device's workspace, which can be large enough to require the use of the user's entire arm.

The PHANToM is about the size of a desk lamp, with 6 degrees of freedom, and is designed to have low friction, inertia, and mass, so that the user feels the programmed attributes of the object rather than those of the device. The touch display can be programmed to contain stationary or moving objects with characteristics including shape, texture, deformation, and other physical and environmental properties that can change with time (such as a changing gravitational constant).

As the user explores the space, the object's properties become evident. Srinivasan explained, "We could program a doughnut shape, put in textures and friction to make it a sticky doughnut, with virtual jelly in the middle that feels gooey, and make it fly away from you as you attempt to grab it."
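Touchable objects like these are built from force models evaluated at the position of the device tip. A common technique in haptic rendering (a standard method in the field, not necessarily the lab's exact code) is penalty-based force computation: when the tip penetrates a virtual surface, the device pushes back with a spring force proportional to penetration depth. A minimal sketch for a rigid sphere, with an illustrative stiffness value:

```python
import numpy as np

def sphere_contact_force(tip_pos, center, radius, stiffness=800.0):
    """Penalty-based contact force for a rigid virtual sphere.

    If the device tip penetrates the sphere, push back along the
    outward surface normal with a spring force F = k * penetration.
    Positions are in meters; stiffness is in N/m.
    """
    offset = np.asarray(tip_pos, dtype=float) - np.asarray(center, dtype=float)
    dist = np.linalg.norm(offset)
    penetration = radius - dist
    if penetration <= 0.0 or dist == 0.0:
        return np.zeros(3)            # tip is outside the sphere (or at its center)
    normal = offset / dist            # outward surface normal
    return stiffness * penetration * normal

# Tip 5 mm inside a 5 cm sphere -> 4 N outward along +x
force = sphere_contact_force([0.045, 0.0, 0.0], [0.0, 0.0, 0.0], 0.05)
```

In this scheme, softness is rendered by lowering the stiffness, while friction and texture are layered on as tangential force terms added to the normal force.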

The computing requirements are considerably different from those of computer graphics. The modeling algorithms and the haptic interface device (which is a type of robot) must be quite fast. Computer graphics can project video at 30 frames/s, and persistence of vision allows us to see continuous motion. But humans can sense temporal frequencies of more than 1 kHz by touch, so rendering rates must be on the order of kilohertz.
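At kilohertz rates, the haptic "servo loop" has roughly a 1 ms budget per cycle to read the device position, evaluate the force model, and command the motors. A fixed-rate loop might be sketched as follows; the three callbacks are hypothetical stand-ins for real device drivers:

```python
import time

SERVO_RATE_HZ = 1000           # haptics needs ~1 kHz; graphics gets by at 30-60 Hz
PERIOD = 1.0 / SERVO_RATE_HZ   # ~1 ms budget per cycle

def run_servo_loop(read_position, compute_force, send_force, cycles=1000):
    """Fixed-rate haptic servo loop (illustrative sketch).

    Each cycle samples the device, computes a force, commands the
    motors, then idles until the next 1 ms tick.
    """
    next_deadline = time.perf_counter()
    for _ in range(cycles):
        pos = read_position()          # sample device encoders
        force = compute_force(pos)     # collision detection + force model
        send_force(force)              # command the motors
        next_deadline += PERIOD
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)      # idle until the next tick
```

If a cycle overruns its budget, rendered surfaces feel soft or begin to vibrate, which is why haptic force models must stay far cheaper to evaluate than typical graphics rendering.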

The group's research into the psychophysics of touch also provides information needed for designing better interfaces. For example, Touch Lab researchers have discovered the finest differences in texture and compliance that humans can discern, and how vision or sound affects touch perception. (A student is currently working on developing a thermal simulator in collaboration with the MIT AI Lab.)

Some force-feedback devices are available commercially now -- such as arcade games that shake, Power Gloves, and force-feedback joysticks. However, these offer very coarse, in some cases binary, feedback. The PHANToM, which costs about $10,000 to $15,000, is two orders of magnitude more expensive than these toys, but also offers a far wider range of displacement and force, better resolution, better bandwidth, and 6 d.o.f. compared to 1 or 2. Several other high-quality devices are being marketed by other companies.

Potential applications

One ideal application of the interface system would be training surgeons for minimally invasive surgery. In performing this kind of surgery, surgeons watch a video image of the area and "feel" using tools, because they cannot reach the surgery site with their fingers. Training occurs on the job.

A virtual-reality-based training system with haptic and graphic displays would be just as beneficial to a surgeon as a flight simulator is to a pilot. Such a system requires realistic models of tissues and organs, which Srinivasan's group is building now.

Also, a haptic interface can be used in computer-aided design to allow assembly of items before a physical prototype is made. In a car engine, for example, the designer could find out whether a design allows enough room to change the oil filter much more easily with haptic displays than with purely graphical ones. Other applications include tele-operation of remote vehicles and equipment, education, and (as always) entertainment.

Better robots by better understanding

In the long term, understanding how human touch works may advance the state of robotics. "The best current robotic hands don't have the capability of a two-year-old," said Srinivasan.

On the other hand, robotic hands can offer capabilities that human hands do not, such as the ability to withstand high temperatures, or fingers that rotate 360 deg. "Many of the things hands do," Srinivasan said, "depend on the proper integration of the mechanical, sensorimotor, and cognitive systems."

Not everything is hardware -- software, learning, memory, and intelligence are also important. Human haptics can be used as inspiration, although not always as models. "People tend to take hand functions for granted," Srinivasan said, "and underestimate the complexity of the hand."


Yvonne Carts-Powell
Yvonne Carts-Powell, based in Boston, writes about optoelectronics and the Internet.