A variety of 3D-imaging systems exist. However, few standards are available to evaluate the performance of these systems. The National Research Council of Canada's Institute for Information Technology (NRCC-IIT) works with other research institutes in the country and around the world to help define and refine relevant emerging standards. We have developed a series of statistically traceable procedures for evaluating the geometrical performance of a 3D-imaging system. Moreover, we propose using terminology that should be familiar to technicians who regularly use geometrical dimensioning and tolerancing (GD&T) procedures.

We begin with three classes of surface forms—planes, spheres, and freeform surfaces—and assess the precision, trueness, and surface response of a system. Precision is expressed as measurement uncertainty and trueness as measurement error.^{1,2} Surface response is a complex topic and consequently will not be discussed here. Each surface form is provided as a certified reference surface (CRS) with associated certified reference values (CRVs). Test procedures are used to generate values for flatness, roundness, angularity, diameter error, angle error, sphere-spacing error, and unidirectional and bidirectional plane-spacing errors that are statistically linked to a CRS through its CRVs.

Measurement precision represents the spread of measurements about a model of the CRS. A best-fit procedure is used to minimize the uncertainty, and a precision characteristic value is generated to indicate the size of the spread. If the CRS is a plane, the associated GD&T term is ‘flatness’ (*F*).^{3} We define *F* as the size of the region within which at least 99.7% of the measurements generated by the system fall, similar to a method described in the VDI (Association of German Engineers) standard 2634 Part 2.^{4} If the CRS is a sphere, the associated GD&T term is ‘roundness’ (*R*),^{3} which we define in a similar manner.
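As an illustration of the flatness definition, the sketch below fits a least-squares plane to a measured point cloud and reports the width of the band, normal to that plane, that contains at least 99.7% of the points. This is a minimal interpretation of the procedure described above, not the NRCC-IIT implementation; the function names and the quantile-based coverage computation are our own assumptions.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through an (N, 3) point cloud.

    Returns the centroid and the unit normal, taken as the direction of
    smallest variance from an SVD of the centered points.
    """
    centroid = points.mean(axis=0)
    # The right-singular vector with the smallest singular value is the normal.
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]

def flatness(points, coverage=0.997):
    """Width of the band, normal to the best-fit plane, containing at
    least `coverage` of the measurements (illustrative F value)."""
    centroid, normal = fit_plane(points)
    d = (points - centroid) @ normal  # signed distances to the plane
    tail = (1.0 - coverage) / 2.0
    lo, hi = np.quantile(d, [tail, 1.0 - tail])
    return hi - lo
```

For normally distributed measurement noise with standard deviation σ, this band width is roughly 6σ, so *F* scales directly with the system's depth noise.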

The maximum *F* and *R* values generated in the working volume are associated with the system. The top graph of Figure 1 shows *F* and *R* profiles for a 3D-imaging system. Of particular interest to GD&T technicians is the effect of plane orientation on *F*, shown in the bottom graph of the figure. Accordingly, we use the GD&T term ‘angularity’ (*A*) to represent the largest *F* value generated when the CRS is angled with respect to the depth axis. The repeatability (uncertainty) of *F* and *R* is obtained from repeated measurements; each value is then tested to ensure that it is significantly larger than the corresponding CRV.
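The final test above can be sketched as a one-sided significance check: given repeated *F* (or *R*) values, verify that their mean exceeds the CRV by more than the uncertainty of the mean. The normal approximation and the coverage factor k=3 below are our own illustrative assumptions, not the published procedure.

```python
import statistics

def exceeds_crv(values, crv, k=3.0):
    """One-sided check that repeated characteristic values (e.g. flatness)
    are significantly larger than the certified reference value.

    True when the mean exceeds the CRV by at least k standard errors of
    the mean (normal approximation; k=3 is illustrative).
    """
    mean = statistics.fmean(values)
    sem = statistics.stdev(values) / len(values) ** 0.5
    return mean - k * sem > crv
```

When this check passes, the measured spread can be attributed to the imaging system rather than to residual form error in the reference surface itself.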

**Figure 1.** Flatness and roundness versus depth (top); flatness versus orientation (bottom).

Measurement trueness represents the difference between a CRV and a measured value. If the CRS is a sphere, the diameter error (*E*_{D}) is the difference between the measured sphere diameter and the corresponding CRV. The sphere-to-sphere distance is described in VDI 2634 Part 2,^{4} and we define a similar term, ‘sphere-spacing error’ (*E*_{SS}). Plane separation is also an important characteristic. For that reason, we define the unidirectional (*E*_{UPS}) and bidirectional (*E*_{BPS}) plane-spacing errors. Finally, we define the angle error (*E*_{a}) between planes. In all cases the repeatability of the error values is computed so that they can be compared to the appropriate CRV to verify that no error value differs significantly from the reference value.
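The two sphere-based trueness terms can be illustrated with a standard algebraic least-squares sphere fit. The linearized fit below (solving for the center and radius from |p|² = 2p·c + (r² − |c|²)) is a common technique we have substituted for illustration; the article does not specify the fitting method, and the function names are hypothetical.

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit to an (N, 3) point cloud.

    Solves the linear system for center c and radius r arising from
    |p|^2 = 2 p.c + (r^2 - |c|^2).  Returns (center, diameter).
    """
    A = np.column_stack([2 * points, np.ones(len(points))])
    b = (points ** 2).sum(axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = x[:3]
    radius = np.sqrt(x[3] + center @ center)
    return center, 2 * radius

def diameter_error(points, crv_diameter):
    """E_D: measured sphere diameter minus the certified reference diameter."""
    _, d = fit_sphere(points)
    return d - crv_diameter

def sphere_spacing_error(points_a, points_b, crv_spacing):
    """E_SS: measured center-to-center distance minus the certified spacing."""
    ca, _ = fit_sphere(points_a)
    cb, _ = fit_sphere(points_b)
    return np.linalg.norm(ca - cb) - crv_spacing
```

The plane-spacing errors (*E*_{UPS}, *E*_{BPS}) and angle error follow the same pattern: fit the geometric primitives, derive the spacing or angle, and subtract the CRV.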

We have briefly described a series of statistically traceable procedures designed to evaluate the flatness, angularity, roundness, diameter error, sphere-spacing error, unidirectional and bidirectional plane-spacing errors, and angle error of a 3D-imaging system. These procedures describe only geometrical performance. The test suite being developed by the NRCC-IIT includes measures of model fidelity, resolution, and optical properties, associated reference surfaces, and procedures to measure system repeatability, intermediate precision, and reproducibility. These procedures can be tailored for application-specific analysis, and the terminology has been chosen to be familiar to the typical end user in an industrial environment.

David K. MacKinnon

National Research Council Canada,

Institute for Information Technology

Ottawa, Canada

David MacKinnon is a research associate. A former statistician with a doctorate in systems engineering, he now develops statistically based methods for assessing the performance of 3D-imaging systems.

References:

1. Working Group 2 of the Joint Committee for Guides in Metrology (JCGM), *JCGM 200:2008: International Vocabulary of Metrology: Basic and General Concepts and Associated Terms (VIM),* 3rd ed., Bureau International des Poids et Mesures, Sèvres, 2008.

2. American Society for Testing and Materials, *ASTM E 2544-08: Standard Terminology for Three-Dimensional (3D) Imaging Systems,* ASTM International, West Conshohocken, PA, 2008.

3. American Society of Mechanical Engineers, *ASME Y14.5.1M-1994: Mathematical Definition of Dimensioning and Tolerancing Principles,* ASME, 2004.

4. Association of German Engineers, *VDI/VDE 2634 Part 2: Optical 3-D Measuring Systems*, Beuth, Berlin, 2002.