
Proceedings Paper

Robot free-flyers in space extravehicular activity
Author(s): Harald J. Weigl; Harold L. Alexander

Paper Abstract

The Laboratory for Space Teleoperation and Robotics is developing a neutrally buoyant robot for research into the automatic and teleoperated (remote human) control of unmanned robotic vehicles for use in space. The goal of this project is to develop a remote robot with maneuverability and dexterity comparable to that of a space-suited astronaut with a manned maneuvering unit, able to assume many of the tasks currently planned for astronauts during extravehicular activity (EVA). Such a robot would spare the great expense and hazards associated with human EVA and make possible much less expensive scientific and industrial exploitation of orbit. Both autonomous and teleoperated control experiments will require the vehicle to control its position and orientation automatically. The laboratory has developed a real-time vision-based navigation and control system for its underwater space robot simulator, the Submersible for Telerobotic and Astronautical Research (STAR). The system, implemented with standard, inexpensive computer hardware, has excellent performance and robustness characteristics for a variety of applications, including automatic station-keeping and large controlled maneuvers. Experimental results are presented indicating the precision, accuracy, and robustness to disturbances of the vision-based control system. The study demonstrates the feasibility of vision-based control and navigation for remote robots and provides a foundation for developing a system for general space robot tasks. The complex vision sensing problem is reduced through linearization to a simple algorithm, fast enough to be incorporated into a real-time vehicle control system. Vision sensing is structured to detect small changes in vehicle position and orientation from a nominal positional state relative to a target scene. The system uses a constant, linear inversion matrix to measure the vehicle positional state from the locations of navigation features in an image. This paper describes the underwater vehicle's vision-based navigation and control system and applications of vision-based navigation and control for free-flying space robots. Experimental results from underwater tests of STAR's vision system are also presented.
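The linearized sensing scheme summarized above can be illustrated with a short sketch. The code below is not the authors' implementation; the feature layout, the sensitivity (Jacobian) matrix values, and the function names are hypothetical. It shows only the core idea stated in the abstract: precompute a constant inverse matrix that maps small image-plane displacements of known navigation features to small deviations of the vehicle's position and orientation, so that each video frame requires little more than one matrix-vector product.

```python
import numpy as np

# Minimal sketch (not the paper's actual implementation) of linearized
# vision-based state estimation: small deviations of the vehicle pose from a
# nominal state are recovered from the image-plane displacements of known
# navigation features using a constant inversion matrix.

# Hypothetical example values: image locations (pixels) of four navigation
# features seen from the nominal vehicle pose.
nominal_features = np.array([
    [120.0, 100.0],
    [520.0, 100.0],
    [520.0, 380.0],
    [120.0, 380.0],
])

# Jacobian J maps a small pose deviation dx = [dX, dY, dZ, droll, dpitch, dyaw]
# to the stacked feature displacements. In a real system it would come from
# the camera model and target geometry; here it is a placeholder filled with
# illustrative numbers.
rng = np.random.default_rng(0)
J = rng.normal(size=(2 * len(nominal_features), 6))  # 8 x 6 sensitivity matrix

# The constant, linear inversion matrix: computed once (pseudo-inverse of J)
# and reused every frame, keeping the per-frame cost low enough for real time.
J_pinv = np.linalg.pinv(J)  # 6 x 8

def estimate_pose_deviation(observed_features: np.ndarray) -> np.ndarray:
    """Estimate the 6-DOF deviation from the nominal pose given the measured
    image locations of the navigation features."""
    d_features = (observed_features - nominal_features).reshape(-1)
    return J_pinv @ d_features

# Usage: simulate a small true deviation and recover it from the features.
true_dx = np.array([0.02, -0.01, 0.03, 0.001, -0.002, 0.0015])
observed = nominal_features + (J @ true_dx).reshape(-1, 2)
print(estimate_pose_deviation(observed))  # approximately equal to true_dx
```

Because the inversion matrix is fixed at the nominal pose, an estimate of this kind is valid only for small deviations from that pose; handling the large controlled maneuvers mentioned in the abstract would presumably require re-linearizing or updating the feature model as the vehicle moves.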

Paper Details

Date Published: 1 November 1992
PDF: 12 pages
Proc. SPIE 1829, Cooperative Intelligent Robotics in Space III, (1 November 1992); doi: 10.1117/12.131722
Author Affiliations:
Harald J. Weigl, Massachusetts Institute of Technology (United States)
Harold L. Alexander, Massachusetts Institute of Technology (United States)


Published in SPIE Proceedings Vol. 1829:
Cooperative Intelligent Robotics in Space III
Jon D. Erickson, Editor(s)

© SPIE.