
Proceedings Paper
Calibration of a vision-based location system with Hybrid Genetic-Newton Method
Paper Abstract
To correct the uncertainty of the vision-based location system, a Hybrid Genetic-Newton Method (HGNM) is presented to calibrate its camera model. This method minimizes the uncertainty of the camera model by fusing the Genetic Algorithm (GA) and the Newton method. First, the camera model of the vision-based location system is built according to the image-forming rule and the space-geometry transformation principle of its visual measuring device. Second, the initial camera parameters generated by the genetic process are iterated by the Newton method until the required accuracy is met; otherwise, new populations are generated by the GA and re-iterated by the Newton method. Third, a novel vision-based location system is designed to illustrate the application advantages of the modeling framework. The experimental results show that the absolute error range of HGNM is [-1.1, 1.0] mm and the relative error range is [-9.49%, 0.11%]. This reveals that the accuracy of HGNM is about four times higher than that of the LM method and up to six times higher than that of the Newton method. In all, HGNM is superior to traditional methods for camera-model calibration of the vision-based location system.
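
The hybrid loop described in the abstract (a genetic stage that proposes candidate camera parameters, Newton refinement of the best candidate, and regeneration of the population when the accuracy target is missed) can be sketched as follows. This is a minimal illustration only, assuming a generic least-squares calibration objective; the placeholder function `objective`, the population size, bounds, and tolerances are assumptions for the sketch and are not taken from the paper.

```python
# Minimal sketch of a hybrid GA-Newton calibration loop.
# All numeric settings and the objective are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)

def objective(p):
    # Placeholder reprojection-error surrogate; in practice this would be the
    # sum of squared image-plane residuals of the camera model.
    return np.sum((p - np.array([1.2, -0.5, 3.0])) ** 2) + 0.1 * np.sum(p ** 4)

def num_grad(f, p, h=1e-5):
    # Central-difference gradient.
    g = np.zeros_like(p)
    for i in range(p.size):
        e = np.zeros_like(p); e[i] = h
        g[i] = (f(p + e) - f(p - e)) / (2 * h)
    return g

def num_hess(f, p, h=1e-4):
    # Finite-difference Hessian, symmetrized.
    n = p.size
    H = np.zeros((n, n))
    for i in range(n):
        e = np.zeros_like(p); e[i] = h
        H[:, i] = (num_grad(f, p + e, h) - num_grad(f, p - e, h)) / (2 * h)
    return 0.5 * (H + H.T)

def newton_refine(f, p, tol=1e-8, max_iter=50):
    # Newton iterations on the candidate produced by the genetic stage.
    for _ in range(max_iter):
        g = num_grad(f, p)
        if np.linalg.norm(g) < tol:
            return p, True
        try:
            step = np.linalg.solve(num_hess(f, p), g)
        except np.linalg.LinAlgError:
            return p, False
        p = p - step
    return p, np.linalg.norm(num_grad(f, p)) < tol

def genetic_candidate(f, dim, pop=40, gens=30, bounds=(-5.0, 5.0)):
    # Simple real-coded GA: tournament selection, blend crossover, Gaussian mutation.
    lo, hi = bounds
    P = rng.uniform(lo, hi, size=(pop, dim))
    for _ in range(gens):
        fit = np.array([f(ind) for ind in P])
        children = []
        for _ in range(pop):
            i, j = rng.integers(0, pop, 2)
            a = P[i] if fit[i] < fit[j] else P[j]
            k, l = rng.integers(0, pop, 2)
            b = P[k] if fit[k] < fit[l] else P[l]
            w = rng.uniform(0, 1, dim)
            children.append(np.clip(w * a + (1 - w) * b + rng.normal(0, 0.1, dim), lo, hi))
        P = np.array(children)
    fit = np.array([f(ind) for ind in P])
    return P[np.argmin(fit)]

def hybrid_ga_newton(f, dim, restarts=5, tol=1e-8):
    # Outer loop: GA seeds Newton; if the required accuracy is not reached,
    # a new population is generated and the refinement is repeated.
    best = None
    for _ in range(restarts):
        p, ok = newton_refine(f, genetic_candidate(f, dim), tol=tol)
        if best is None or f(p) < f(best):
            best = p
        if ok:
            break
    return best

p_star = hybrid_ga_newton(objective, dim=3)
print("calibrated parameters:", p_star, "objective:", objective(p_star))
```

In this sketch the GA supplies a globally reasonable starting point, while the Newton step provides fast local convergence, which is the division of labor the abstract attributes to HGNM.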
Paper Details
Date Published: 13 November 2019
PDF: 6 pages
Proc. SPIE 11343, Ninth International Symposium on Precision Mechanical Measurements, 113430Q (13 November 2019); doi: 10.1117/12.2548396
Published in SPIE Proceedings Vol. 11343:
Ninth International Symposium on Precision Mechanical Measurements
Liandong Yu, Editor(s)
Author Affiliations:
Zhongyu Wang, Beihang Univ. (China)
© SPIE.
