
Proceedings Paper
Evolving neural networks for video attitude and height sensor
Paper Abstract
The development of an on-board video attitude and height sensor (VAHS), which measures the height, roll, and pitch of an airborne vehicle at low altitude, is presented. The VAHS consists of a down-looking TV camera and two orthogonal pairs of laser diodes (four in total) that project a structured light pattern. Although the height, roll, and pitch can in principle be determined from the locations of the laser dots in the image, in practice it is very difficult to align the laser diodes and the TV camera precisely. It is also hard to obtain accurate camera parameters because of the camera's various nonlinear distortions. An approach that uses layered neural networks (NNs) to map the dot locations in the image to the height, roll, and pitch of the airborne vehicle is therefore presented. Amorphous NNs have also been evolved by genetic algorithms (GAs), with mixed results. Simulation results from these experiments are presented.
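To illustrate the mapping the abstract describes, here is a minimal sketch (not from the paper) of a layered NN that takes the image coordinates of the four laser dots (8 inputs) and produces height, roll, and pitch estimates (3 outputs). The layer sizes, activation function, and weight initialization are illustrative assumptions; in the paper the weights would be learned (or, for the amorphous networks, evolved by a GA).

```python
import numpy as np

rng = np.random.default_rng(0)

def init_layer(n_in, n_out):
    # Small random weights; in practice these are trained (backprop)
    # or evolved (GA) against measured calibration data.
    return rng.normal(scale=0.1, size=(n_in, n_out)), np.zeros(n_out)

# Hypothetical architecture: 8 inputs -> 16 hidden units -> 3 outputs.
W1, b1 = init_layer(8, 16)
W2, b2 = init_layer(16, 3)

def vahs_forward(dot_xy):
    """dot_xy: array of shape (8,), the (x, y) image coordinates
    of the four laser dots. Returns (height, roll, pitch) estimates."""
    h = np.tanh(dot_xy @ W1 + b1)   # hidden layer
    return h @ W2 + b2              # linear output layer

estimate = vahs_forward(np.array([0.1, 0.2, -0.1, 0.3, 0.0, -0.2, 0.4, 0.1]))
```

Learning this mapping end-to-end sidesteps the two problems the abstract names: no explicit laser/camera alignment model and no analytic camera-distortion model are needed, since both are absorbed into the trained weights.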
Paper Details
Date Published: 5 July 1995
PDF: 11 pages
Proc. SPIE 2484, Signal Processing, Sensor Fusion, and Target Recognition IV, (5 July 1995); doi: 10.1117/12.213031
Published in SPIE Proceedings Vol. 2484:
Signal Processing, Sensor Fusion, and Target Recognition IV
Ivan Kadar; Vibeke Libby, Editor(s)
Author Affiliations:
Zhixiong Zhang, George Mason Univ. (United States)
Kenneth J. Hintz, George Mason Univ. (United States)
© SPIE.
