
Proceedings Paper

Hand-eye coordination for grasping moving objects
Author(s): Peter K. Allen; Billibon Yoshimi; Alexander Timcenko; Paul Michelman

Paper Abstract

Most robotic grasping tasks assume a stationary or fixed object. In this paper, we explore the requirements for grasping a moving object. This task requires proper coordination among at least three separate subsystems: dynamic vision sensing, real-time arm control, and grasp control. As with humans, our system first visually tracks the object's 3-D position. Because the object is in motion, tracking must be done dynamically so that the robotic arm's motion can be coordinated as it follows the object. The dynamic vision system feeds a real-time arm-control algorithm that plans a trajectory. The arm-control algorithm is implemented in two steps: 1) filtering and prediction, and 2) kinematic transformation computation. Once the object's trajectory is tracked, the hand must intercept the object to actually grasp it. We present three strategies for intercepting the object, along with results from the tracking algorithm.
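The abstract's first arm-control step, "filtering and prediction", is the kind of stage that is commonly realized with a simple tracking filter. The sketch below is illustrative only (it is not the authors' implementation, and the function name, gains, and 1-D setup are assumptions): an alpha-beta filter smooths noisy vision measurements of the object's position and predicts where the object will be one sampling interval ahead, which is what the arm controller would need to plan an intercepting trajectory.

```python
# Illustrative alpha-beta tracking filter (hypothetical; not the paper's code).
# Given a stream of noisy 1-D position measurements sampled every dt seconds,
# it returns a filtered position estimate and a one-step-ahead prediction.

def alpha_beta_track(measurements, dt, alpha=0.85, beta=0.005):
    """Return a list of (filtered_position, predicted_next_position) pairs."""
    x = measurements[0]   # initial position estimate from the first measurement
    v = 0.0               # initial velocity estimate
    out = []
    for z in measurements[1:]:
        # Predict the position forward one step with the current velocity.
        x_pred = x + v * dt
        # Correct the prediction using the measurement residual.
        r = z - x_pred
        x = x_pred + alpha * r
        v = v + (beta / dt) * r
        # Report the filtered position and where the object should be next step.
        out.append((x, x + v * dt))
    return out

if __name__ == "__main__":
    dt = 0.1
    # Noiseless object moving at 1.0 unit/s, sampled every 0.1 s.
    zs = [i * dt for i in range(6)]
    for filt, pred in alpha_beta_track(zs, dt):
        print(f"filtered={filt:.3f} predicted={pred:.3f}")
```

In practice the same recursion would run per coordinate of the object's 3-D position, and the prediction horizon would be matched to the arm controller's cycle time rather than to a single vision sample.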

Paper Details

Date Published: 1 April 1991
PDF: 13 pages
Proc. SPIE 1383, Sensor Fusion III: 3D Perception and Recognition, (1 April 1991); doi: 10.1117/12.25255
Author Affiliations:
Peter K. Allen, Columbia Univ. (United States)
Billibon Yoshimi, Columbia Univ. (United States)
Alexander Timcenko, Columbia Univ. (United States)
Paul Michelman, Columbia Univ. (United States)


Published in SPIE Proceedings Vol. 1383:
Sensor Fusion III: 3D Perception and Recognition
Paul S. Schenker, Editor(s)

© SPIE