
Optical Engineering

Relative spatial pose estimation for autonomous grasping
Author(s): Steve Roach; Michael Magee

Paper Abstract

A technique is presented for finding the relative spatial pose between a robotic end effector and a target object to be grasped, without a priori knowledge of the spatial relationship between the camera and the robot. The transformation between the camera coordinate system and the robot coordinate system is computed dynamically from the pose of the end effector relative to each: a previously developed computer vision technique determines the pose of the end effector relative to the camera, while the robot geometry and data from the robot controller determine the pose of the end effector relative to the robot. The spatial transformation between the robot end effector and the target object is then computed with respect to the robot's coordinate system. The algorithm was demonstrated using a five-degree-of-freedom robot and an RGB camera system. Because no fixed spatial relationship between the camera and robot is assumed, the camera can be repositioned dynamically to optimize the view of the object and the end effector. Further, the iterative nature of the grasping algorithm reduces the effects of camera calibration errors.
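The core transform composition described in the abstract can be sketched with homogeneous transforms: since the end effector's pose is known both in the camera frame (from vision) and in the robot frame (from the controller), the camera-to-robot transform follows by composition, and the target's pose can then be expressed in the robot's coordinate system. The function and frame names below are illustrative assumptions, not the paper's notation; this is a minimal sketch of the idea, not the authors' implementation.

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def invert_pose(T):
    """Invert a rigid-body transform analytically: inv([R t]) = [R^T  -R^T t]."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def camera_to_robot(T_cam_ee, T_rob_ee):
    """Dynamic camera-to-robot transform: the end effector is observed in the
    camera frame and reported in the robot frame, so
    T_rob_cam = T_rob_ee * inv(T_cam_ee)."""
    return T_rob_ee @ invert_pose(T_cam_ee)

def target_in_robot_frame(T_cam_target, T_cam_ee, T_rob_ee):
    """Express the target object's pose in the robot's coordinate system."""
    return camera_to_robot(T_cam_ee, T_rob_ee) @ T_cam_target
```

Because the camera-to-robot transform is recomputed from current observations, moving the camera requires no recalibration of a fixed camera-robot relationship, which is the property the abstract highlights.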

Paper Details

Date Published: 1 December 1997
PDF: 9 pages
Opt. Eng. 36(12), doi: 10.1117/1.601586
Published in: Optical Engineering Volume 36, Issue 12
Author Affiliations
Steve Roach, Univ. of Wyoming (United States)
Michael Magee, Southwest Research Institute (United States)


© SPIE