
Sensing & Measurement

Improving accuracy of terrestrial laser scanners and range cameras

Measurement devices for direct observation of 3D coordinates using active ranging show systematic errors, but range calibration methodologies have great potential for error removal.
7 July 2009, SPIE Newsroom. DOI: 10.1117/2.1200906.1694

Acquiring 3D information for the geometric modeling of objects at distances of up to 100m is essential for applications such as industrial site reconstruction, cultural heritage monitoring, and city modeling. Depending on the application, the accuracy requirement lies between ±0.5mm and ±2cm. To meet these demands, the two measurement devices used in such applications, laser scanners and range cameras, require improved accuracy.

While laser scanners and range cameras measure angles and ranges, the results are often processed as 3D point clouds, i.e., sets of points in three-dimensional Cartesian space. It is possible to break down the total error budget of the measurements into random and systematic errors.1 Random errors and their influence can be reduced by exploiting redundancy in the measurements, such as fitting surfaces to 3D points in the model reconstruction step. Systematic errors, on the other hand, are deterministic deviations from the true value, and have the same size under identical measurement conditions. The removal of systematic errors, therefore, requires an error model obtained by comparison to external or internal references.
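As an illustration of how redundancy suppresses random errors, the sketch below fits a plane to a synthetic noisy patch by least squares; the patch geometry and the 2mm noise level are invented for the example, not taken from any particular instrument:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic patch: 500 points on the plane z = 0.1x + 0.2y + 5,
# perturbed by 2 mm of random ranging noise (hypothetical values).
x, y = rng.uniform(0, 2, (2, 500))
z = 0.1 * x + 0.2 * y + 5 + rng.normal(0, 0.002, 500)
pts = np.column_stack([x, y, z])

# Least-squares plane fit: the plane normal is the singular vector of
# the centered point cloud belonging to the smallest singular value.
centroid = pts.mean(axis=0)
_, _, vt = np.linalg.svd(pts - centroid)
normal = vt[-1]

# Signed distances of the measured points to the adjusting plane.
residuals = (pts - centroid) @ normal
print(residuals.std())  # close to the 2 mm per-point noise
```

Because 500 noisy points constrain only three plane parameters, the fitted plane is far more precise than any single point, which is exactly the redundancy exploited in the model reconstruction step.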

Range calibration methodologies have the ability to reduce systematic errors in practical cases by 50% or more. We specifically provide methods using internal references, thus allowing for on-the-job calibration. Figure 1 shows laser scanning data gathered in a room of Schönbrunn Palace in Vienna, both before and after applying such a calibration.

Figure 1. A laser scanning point cloud is partitioned into patches, and residuals to an adjusting plane are colored according to the distance from the measured points to the plane. [left] The typical pattern of periodic range errors becomes visible, featuring a magnitude of a few millimeters. [right] The systematic errors have been removed successfully by range calibration.2
Terrestrial laser scanners

For scenes with depths of 1m to 1km, terrestrial laser scanners3 allow direct observation of ranges. They measure the distance between sensor and object by emitting a modulated laser signal, while a 2D angular deflection of the emitted beam sweeps it across the scene.
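For amplitude-modulated signals, the range follows from the measured phase shift between the emitted and the returned modulation. A minimal sketch, with a hypothetical 10MHz modulation frequency rather than that of any real scanner:

```python
import numpy as np

C = 299_792_458.0   # speed of light, m/s
F_MOD = 10e6        # hypothetical modulation frequency, 10 MHz

def range_from_phase(phase_rad: float) -> float:
    """Range from the phase shift of an amplitude-modulated signal:
    d = c * phi / (4 * pi * f). The measurement is unambiguous only
    up to c / (2 f), here about 15 m."""
    return C * phase_rad / (4 * np.pi * F_MOD)

# A phase shift of pi corresponds to half the ambiguity interval:
print(range_from_phase(np.pi))  # about 7.49 m
```

Longer unambiguous ranges are typically obtained by combining several modulation frequencies; systematic deviations of the measured phase are what the range calibration discussed here absorbs.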

The ranging components of commercially available laser scanners provide accuracies on the order of 1:10,000 (1mm accuracy at 10m range). Most of the suggested calibration schemes4–6 use a controlled environment with a network of known reference points or patches, so redundancy is governed by the number of artificial targets placed in the scene. The error models used in these approaches incorporate physically interpretable error sources, as also used for calibrating the electronic distance measurement units of geodetic total stations,7 augmented by some empirically derived parameters.4

However, the massive over-determination provided by dense point clouds (e.g., 2mm point spacing at 10m distance) can also be used to derive calibration functions using only weak assumptions on the object itself. In the documentation of buildings, for example, this leads to the requirement that surfaces be describable by planar patches a few meters in extent.

In our approach, the local plane parameters, the exterior orientations of the laser scanner, and the calibration function parameters are determined simultaneously by least squares adjustment. The planes are found automatically by segmentation. As calibration function, we suggest a linear spline parameterized over distance.2,8 One advantage of this method is that it also reduces subtle effects, such as multipath between sensor and object. Ultimately, it can reduce the systematic errors of amplitude-modulating laser scanners from over 1mm to below 0.5mm.
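The spline-fitting step can be sketched in isolation as follows. This is a deliberately simplified toy: it assumes range residuals with respect to adjusted planes are already available, and the knot count, distance range, and periodic error model are all invented, whereas the actual adjustment estimates planes, orientations, and spline coefficients simultaneously:

```python
import numpy as np

def hat_basis(d, knots):
    """Design matrix of piecewise-linear (hat) basis functions over
    distance, so that A @ coeffs evaluates a linear spline at d."""
    A = np.zeros((len(d), len(knots)))
    for j in range(len(knots)):
        left = knots[max(j - 1, 0)]
        right = knots[min(j + 1, len(knots) - 1)]
        A[:, j] = np.clip(
            np.where(d < knots[j],
                     (d - left) / max(knots[j] - left, 1e-12),
                     (right - d) / max(right - knots[j], 1e-12)),
            0, 1)
    return A

# Hypothetical residuals: a periodic range error of a few millimeters
# plus 0.5 mm of random noise, over distances of 1-30 m.
rng = np.random.default_rng(1)
d = rng.uniform(1, 30, 2000)
resid = 0.002 * np.sin(d / 2.0) + rng.normal(0, 0.0005, 2000)

# Least-squares estimate of the spline coefficients at 25 knots.
knots = np.linspace(1, 30, 25)
coeffs, *_ = np.linalg.lstsq(hat_basis(d, knots), resid, rcond=None)

corrected = resid - hat_basis(d, knots) @ coeffs
print(resid.std(), corrected.std())  # systematic part largely removed
```

The dense point cloud supplies far more observations than spline coefficients, so the fit is well determined without any artificial targets.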

Range cameras

Time-of-flight (TOF) range cameras9 capture the geometry of a whole scene at once using simultaneous range measurements on a focal plane. Though range cameras are inferior to laser scanners in resolution, accuracy, and maximum range, they offer faster acquisition and are less bulky.

Off-the-shelf TOF range imaging cameras based on photonic mixing device10 technology provide accuracies of less than 1:1,000, with a maximum range on the order of a few tens of meters. Their principal advantages lie in their high frame rate, low weight and power consumption, and compactness. Nevertheless, the stability of these instruments and of their calibration remains in question, and a comprehensive overview of their systematic effects is still lacking. These shortcomings call for on-the-job elimination of systematic errors.11,12

Essentially, two technical challenges remain. First, simultaneously determining the exterior orientation of the range video stream and the internal orientation (calibration) is hampered by correlations between the two; to date, no rigorous simultaneous approach has been introduced. Second, distortions of the internal orientation depend on various factors: range, intensity, position in the image plane, surface orientation towards the camera, object brightness, and camera orientation with regard to the gravity vector.

Look-up tables providing averaged correction values could be used. However, they are difficult to construct in the presence of outliers and in a high-dimensional, sparsely populated parameter space, and they lack the ability to extrapolate calibration values. This is why many prefer models that use basis functions with a limited number of parameters, although only limited physical justification has been found for these models so far. Practical examples have demonstrated a reduction of systematic errors from 2cm to 5mm. Figure 2 shows the modeled impact of the sensor position on range measurements.
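The extrapolation limitation can be illustrated with a one-dimensional toy correction table; the bin layout and correction values are invented for the example:

```python
import numpy as np

# Hypothetical per-bin correction table, as if built by averaging
# residuals over distance bins (one axis of the full parameter space).
bin_centers = np.linspace(0.5, 7.5, 8)       # distances, m
corrections = 0.02 * np.sin(bin_centers)     # averaged residuals, m

def lut_correction(d):
    # np.interp clamps at the table ends: the look-up table cannot
    # extrapolate beyond the calibrated distance range.
    return np.interp(d, bin_centers, corrections)

def model_correction(d, a=0.02, omega=1.0):
    # A parametric alternative with two coefficients, evaluable at any
    # distance (coefficients here are illustrative, not physical).
    return a * np.sin(omega * d)

# At 10 m, outside the 0.5-7.5 m table, the LUT returns the clamped
# edge value while the parametric model still extrapolates.
print(lut_correction(10.0), model_correction(10.0))
```

In the real calibration problem the table would span several such axes at once (range, intensity, image position, and so on), which is precisely where its sparse population becomes problematic.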

Figure 2. Distortion of range measurements of a time-of-flight range imaging camera, as a function of sensor position. [left] Range residuals, having applied the corrections for all distortion factors except sensor position. [right] Range residuals, having applied all corrections, i.e., calibration residuals.11

Using range calibration methodologies, we can reduce the systematic errors in laser scanning point clouds by more than 50% and improve overall accuracy. Although we can also reduce the systematic errors of range imaging cameras, considerable residual systematic errors remain. Ideally, technological advances in range cameras will increase precision and resolution, but for now, a solution to the calibration task is still required. In the meantime, we will investigate the integration of other sensors for observing elements of the exterior orientation (the angular attitude of the device) as a possible solution.

This work will be presented in the conference Videometrics, Range Imaging and Applications at the SPIE Optics + Photonics symposium in August 2009 in San Diego.

Norbert Pfeifer, Camillo Ressl, Wilfried Karel
Institute of Photogrammetry and Remote Sensing
Vienna University of Technology
Vienna, Austria

Norbert Pfeifer received his PhD from the Vienna University of Technology in 2002. He was a postdoctoral researcher at the Vienna University of Technology, the Delft University of Technology, and the Competence Centre alpS, Centre for Natural Hazard Management, Innsbruck, Austria. Currently, he is a professor of photogrammetry. His research interests are topographic and 3D modeling, laser ranging and scanning, and photogrammetry.

Camillo Ressl received his PhD in geodesy and works with spatial data from laser scanning and remote sensing as a collaborator at the Christian Doppler Laboratory at the Institute of Photogrammetry and Remote Sensing. He is active in the International Society for Photogrammetry and Remote Sensing and in the European Spatial Data Research Network. His research interests are orientation and reconstruction from the data of different optical (active and passive) sensors, and the associated mathematical models.

Wilfried Karel received his master's degree from the Vienna University of Technology and is pursuing his PhD on the subject of range imaging cameras. Besides software engineering, his research interests include computer vision, sensor modeling, and automated model reconstruction.