
Proceedings Paper

Adaptive acquisition and modeling for free-form surface with structured-light vision sensor
Author(s): Kangning Chen; Hang Chen; Zhigang Liu

Paper Abstract

Rapid, high-precision data acquisition and geometric modeling of components with free-form surfaces have wide applications, including part localization, automatic calibration, and reverse engineering. Integrating a structured-light vision sensor with a CMM (Coordinate Measuring Machine) enhances high-precision coordinate measurement capability. In this paper, a curvature-based adaptive sampling approach and an evaluation index for sampling precision are presented. The matching and subdividing algorithm for generating matrix-type mesh data from the sampled points is described. A methodology to register and merge the measured data from multiple viewpoints to model the free-form surface is also presented. Given an initial rotation matrix R and translation vector T, data from the different viewpoints can be transformed into a unique frame of reference. By introducing a special coordinate system in 3D space, the registered data are divided into meshes that cover the whole surface of the object. An application example of shoe modeling is described to illustrate the advantages.
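The registration step summarized in the abstract amounts to a rigid transformation of each viewpoint's point cloud into a common reference frame. The following is a minimal sketch, not the authors' implementation, assuming each viewpoint provides an (N, 3) array of measured points together with its initial rotation matrix R and translation vector T:

    import numpy as np

    def register_viewpoint(points, R, T):
        """Map an (N, 3) array of sensor points into the common frame
        via p' = R @ p + T, applied row-wise."""
        points = np.asarray(points, dtype=float)   # (N, 3) measured points
        R = np.asarray(R, dtype=float)             # (3, 3) rotation matrix
        T = np.asarray(T, dtype=float)             # (3,) translation vector
        return points @ R.T + T

    def merge_viewpoints(clouds, transforms):
        """Register every viewpoint and stack the results into one cloud.
        clouds: list of (N_i, 3) arrays; transforms: list of (R, T) pairs."""
        registered = [register_viewpoint(p, R, T)
                      for p, (R, T) in zip(clouds, transforms)]
        return np.vstack(registered)

In practice the initial R and T would typically be refined (for example by an iterative closest-point step) before the merged cloud is subdivided into the matrix-type mesh described in the paper.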

Paper Details

Date Published: 16 October 2000
PDF: 9 pages
Proc. SPIE 4196, Sensor Fusion and Decentralized Control in Robotic Systems III, (16 October 2000); doi: 10.1117/12.403730
Author Affiliations:
Kangning Chen, Xi'an Jiaotong Univ. (China)
Hang Chen, Gintic Institute of Manufacturing Technology (Singapore)
Zhigang Liu, Xi'an Jiaotong Univ. (China)


Published in SPIE Proceedings Vol. 4196:
Sensor Fusion and Decentralized Control in Robotic Systems III
Gerard T. McKee; Paul S. Schenker, Editor(s)

© SPIE