
Proceedings Paper

Visual Feedback For Robotic Manipulations Under Arbitrary Loading
Author(s): Luis R. Lopez

Paper Abstract

An iterative series of linear and nonlinear transformations can map a user-supplied position command to a set of self-adapting servo commands. Visual information from the manipulator is mapped to a current position estimate, which is then combined with the operator's position command to produce an actuator command. The newly generated actuator commands change the manipulator position, and visual information about the new position closes a feedback loop, eliciting an iterative chain of transformations from visual information into control commands. The iterations continue until the manipulator is within the desired position tolerance. This scheme adapts dynamically to arbitrary loads and to changes in dynamical parameters. Such a series of transformations is precisely what neural networks compute, and these processing architectures can learn the transformations from example inputs. A neural network system is presented that accomplishes both the learning and the execution of this iterative control scheme. Learning-system design issues that arise in such systems are also discussed.
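The iterative feedback loop sketched in the abstract can be illustrated in Python. This is a minimal sketch, not the paper's method: all names (`visual_servo_loop`, `observe`, `plant`), the proportional gain, and the tolerance are illustrative assumptions, and a fixed linear correction stands in for the learned neural-network transformation the paper actually proposes.

```python
import numpy as np

def visual_servo_loop(target, plant, observe, gain=0.5, tol=1e-3, max_iters=100):
    """Iteratively transform visual position feedback into actuator commands
    until the manipulator is within the desired position tolerance.
    (Illustrative sketch; the paper learns this mapping with a neural network.)"""
    for i in range(max_iters):
        pos = observe()                      # visual information -> current position
        error = target - pos                 # compare with operator's position command
        if np.linalg.norm(error) < tol:
            return pos, i                    # within tolerance: stop iterating
        plant(gain * error)                  # actuator command changes the position
    return observe(), max_iters

# Toy "manipulator": its state is moved directly by the actuator command,
# so the loop converges regardless of the (here trivial) load dynamics.
state = np.zeros(2)
def plant(cmd):
    global state
    state = state + cmd
def observe():
    return state.copy()

final, iters = visual_servo_loop(np.array([1.0, -0.5]), plant, observe)
```

Because the correction is applied through the feedback loop rather than an inverse model, the same loop would still converge if the plant's response changed, which is the sense in which the scheme adapts to arbitrary loading.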

Paper Details

Date Published: 27 March 1989
PDF: 7 pages
Proc. SPIE 1002, Intelligent Robots and Computer Vision VII, (27 March 1989); doi: 10.1117/12.960332
Author Affiliations
Luis R. Lopez, Teledyne Brown Engineering (United States)

Published in SPIE Proceedings Vol. 1002:
Intelligent Robots and Computer Vision VII
David P. Casasent, Editor(s)

© SPIE.