
Proceedings Paper

Explorations In A Sensor Fusion Space
Author(s): Peter K. Allen; Kenneth S. Roberts; Paul Michelman

Paper Abstract

Using a dextrous hand involves understanding the relationships between the three kinds of sensor outputs available: joint space positions, tendon forces, and tactile contacts. Dextrous manipulation, then, is a fusion of these three sensing modalities. This paper is an exploration of using a dextrous, multi-fingered hand (Utah/MIT hand) for high-level object recognition tasks. The paradigm is model-based recognition in which the objects are modeled and recovered as superquadrics, which are shown to have a number of important attributes that make them well suited for such a task. Experiments have been performed to recover the shape of objects using sparse contact point data from the hand kinematics and forces, with promising results. We also present our approach to using tactile data in conjunction with the dextrous hand to build a library of grasping and exploration primitives that can be used in recognizing and grasping more complex multi-part objects.
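The shape-recovery step described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes the standard superquadric inside-outside function F, where F = 1 on the surface, and recovers the size and shape parameters from sparse surface (contact) points by nonlinear least squares. The sampling routine, parameter bounds, and test shape are all illustrative choices.

```python
import numpy as np
from scipy.optimize import least_squares

def superquadric_F(pts, a1, a2, a3, e1, e2):
    """Inside-outside function: F = 1 on the surface, < 1 inside, > 1 outside."""
    x, y, z = np.abs(pts.T)  # the function is symmetric in each octant
    xy = (x / a1) ** (2.0 / e2) + (y / a2) ** (2.0 / e2)
    return xy ** (e2 / e1) + (z / a3) ** (2.0 / e1)

def sample_surface(sizes, e1, e2, n=80, seed=0):
    """Stand-in for contact points: sample the surface parametrically."""
    rng = np.random.default_rng(seed)
    eta = rng.uniform(-np.pi / 2 + 0.1, np.pi / 2 - 0.1, n)
    om = rng.uniform(-np.pi + 0.1, np.pi - 0.1, n)
    sp = lambda v, e: np.sign(v) * np.abs(v) ** e  # signed power
    x = sizes[0] * sp(np.cos(eta), e1) * sp(np.cos(om), e2)
    y = sizes[1] * sp(np.cos(eta), e1) * sp(np.sin(om), e2)
    z = sizes[2] * sp(np.sin(eta), e1)
    return np.column_stack([x, y, z])

def fit_superquadric(pts, x0=(1.0, 1.0, 1.0, 1.0, 1.0)):
    """Recover (a1, a2, a3, e1, e2) by minimizing (F**e1 - 1) over the points."""
    def resid(p):
        a1, a2, a3, e1, e2 = p
        return superquadric_F(pts, a1, a2, a3, e1, e2) ** e1 - 1.0
    bounds = ([0.01] * 3 + [0.1, 0.1], [10.0] * 3 + [2.0, 2.0])
    return least_squares(resid, x0, bounds=bounds).x

# Recover an ellipsoid-like superquadric (e1 = e2 = 1) from sampled points.
pts = sample_surface((2.0, 1.0, 0.5), 1.0, 1.0)
params = fit_superquadric(pts)
```

With noise-free points the fit has a zero-residual minimum at the true parameters, so even a sparse sample constrains the shape well; with real tactile data one would add noise handling and a pose (rotation/translation) estimate, which the sketch omits.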

Paper Details

Date Published: 5 January 1989
PDF: 6 pages
Proc. SPIE 1003, Sensor Fusion: Spatial Reasoning and Scene Interpretation, (5 January 1989); doi: 10.1117/12.948940
Author Affiliations:
Peter K. Allen, Columbia University (United States)
Kenneth S. Roberts, Columbia University (United States)
Paul Michelman, Columbia University (United States)

Published in SPIE Proceedings Vol. 1003:
Sensor Fusion: Spatial Reasoning and Scene Interpretation
Paul S. Schenker, Editor(s)

© SPIE.