Hybrid stereoscopic calibration

New software simultaneously calibrates conventional and omnidirectional cameras in the same stereoscopic rig.
28 June 2011
Guillaume Caron and Damien Eynard

A stereo (or stereoscopic) rig is a device with two or more cameras that makes it possible to simulate human binocular vision and its ability to capture 3D images. The type and number of cameras used depend on the intended application. In the last decade, omnidirectional cameras (standard cameras that point to curved mirrors or use a fisheye lens) have attracted interest because of their wide field of view. However, conventional (perspective) cameras, which have a more limited angle of vision, are still useful: their images have much higher spatial resolution than omnidirectional images because, in general, a similar number of pixels covers a much smaller field of view. Therefore, a stereo rig combining a perspective and an omnidirectional camera has the potential to merge high spatial resolution with a wide field of view (see Figure 1).

Systems that combine different types of cameras on the same rig are called hybrid stereoscopic systems. Their main application fields are video surveillance and localization or navigation in robotics. For instance, a stereo rig with perspective and fisheye cameras can be mounted on an unmanned aerial vehicle (UAV, see Figure 2) to estimate its altitude using images from both cameras, through a method called a plane-sweeping algorithm.1 The attitude of the aircraft, that is, its orientation relative to a reference line or plane, is determined from the fisheye view using the horizon line or, in urban environments, straight lines such as the edge of a building. The field of view of the fisheye lens is wide enough to capture both the ground, which is projected in the center of the image, and the horizon line, which appears on the border of the picture. Image processing can then detect this reference line and give the attitude of the UAV. Recently, we combined both views to obtain a precise and robust estimation of UAV motion.2
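To give a feel for the plane-sweeping idea, the sketch below searches over candidate altitudes for a simplified, purely perspective rig: each candidate ground plane induces a homography between the two views, one image is warped accordingly, and the altitude giving the best photometric match is retained. This is only a rough illustration under assumed conventions; the hybrid method cited above projects through the fisheye model rather than a planar homography, and all function names here are hypothetical.

```python
# Illustrative sketch only (not the authors' code): plane-sweeping altitude
# search for a calibrated rig, simplified to two perspective cameras.
import numpy as np
import cv2

def plane_induced_homography(K1, K2, R, t, n, d):
    """Homography mapping image 1 to image 2 induced by the plane n.X = d
    (expressed in camera-1 coordinates), with X2 = R @ X1 + t."""
    return K2 @ (R + np.outer(t, n) / d) @ np.linalg.inv(K1)

def sweep_altitude(img1, img2, K1, K2, R, t, candidate_altitudes):
    """Return the candidate altitude whose induced warp best matches img2."""
    n = np.array([0.0, 0.0, 1.0])   # assumed ground-plane normal (camera looking down)
    best_alt, best_score = None, np.inf
    for d in candidate_altitudes:
        H = plane_induced_homography(K1, K2, R, t, n, d)
        warped = cv2.warpPerspective(img1, H, (img2.shape[1], img2.shape[0]))
        score = np.mean(np.abs(warped.astype(float) - img2.astype(float)))
        if score < best_score:
            best_alt, best_score = d, score
    return best_alt
```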


Figure 1. Example of a hybrid stereo rig: perspective and fisheye cameras.

Figure 2. A perspective-and-fisheye stereo rig embedded on an unmanned aerial vehicle and its images.

To retrieve 3D information about the environment and the motion of a hybrid stereo rig, the device must first be calibrated, a process that consists of determining the cameras' intrinsic and extrinsic parameters. The relative pose, that is, the rotation and translation between the cameras, constitutes the extrinsic parameters to be estimated. Intrinsic parameters, on the other hand, are those related to the projection models of the cameras in question. When capturing an image, each point of the 3D scene is projected onto the picture through the lens. The projection model is the geometric relationship that relates a point of the scene to a point in the image. Consequently, different types of projection are required depending on the camera, the lens, and possibly the mirror.
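As a minimal illustration of how the two parameter sets act (this is not HySCaS code), the following sketch projects a 3D world point with the standard perspective model: the extrinsic rotation and translation bring the point into the camera frame, and the intrinsic focal lengths and principal point then map it to pixel coordinates.

```python
import numpy as np

def project_pinhole(X_world, R, t, fx, fy, cx, cy):
    """Map a 3D world point to pixel coordinates under the pinhole model.

    R, t           : extrinsic parameters (world-to-camera rotation and translation)
    fx, fy, cx, cy : intrinsic parameters (focal lengths and principal point, in pixels)
    """
    X_cam = R @ X_world + t                                # extrinsics: change of frame
    x, y = X_cam[0] / X_cam[2], X_cam[1] / X_cam[2]        # planar perspective projection
    return np.array([fx * x + cx, fy * y + cy])            # intrinsics: scale and shift to pixels
```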

In our calibration method, we employ a unified (spherical) projection model: one that is generic enough to adapt to various types of cameras. However, using this unified model with a conventional camera is more complicated than the traditional perspective (planar) projection, which is also better adapted to camera-motion estimation from image intensities (that is, pixel values). Hence, our method can also handle stereo rigs that mix a perspective camera, modeled by a planar projection, with an omnidirectional device whose images are modeled by a sphere.
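For reference, here is a sketch of the unified spherical model as it is commonly formulated (again, not taken from HySCaS): the point is first projected onto a unit sphere and then perspectively projected from a center shifted by a mirror/lens parameter xi along the optical axis. With xi = 0 the model reduces to the planar perspective projection.

```python
import numpy as np

def project_unified(X_cam, xi, fx, fy, cx, cy):
    """Unified (spherical) projection of a 3D point given in the camera frame."""
    rho = np.linalg.norm(X_cam)         # distance from the projection center to the point
    Xs = X_cam / rho                    # projection onto the unit sphere
    x = Xs[0] / (Xs[2] + xi)            # perspective projection from the shifted center
    y = Xs[1] / (Xs[2] + xi)
    return np.array([fx * x + cx, fy * y + cy])
```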

We developed the Hybrid Stereoscopic Calibration Software, or HySCaS3 (see Figure 3), to make our calibration routine easy to use.4 The software engine extends a calibration method for a single perspective camera and is based on a generalization of virtual visual servoing, an optimization technique used to compute the pose of an object or a camera from image measurements. We generalized virtual visual servoing to deal with any camera projection model and any number of devices combined on a stereo rig.
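The sketch below conveys the flavor of such a calibration: the parameter vector is refined until the reprojections of the known calibration-pattern points coincide with their detected image locations. A generic least-squares solver stands in here for the virtual-visual-servoing control law, and the structure and names are assumptions rather than HySCaS code.

```python
import numpy as np
from scipy.optimize import least_squares

def reprojection_residuals(params, pattern_points, observed_points, project):
    """Stack the 2D errors between projected and detected pattern corners.

    `project(params, X)` applies the current intrinsic/extrinsic estimate
    (pose plus projection-model parameters) to a 3D pattern point X.
    """
    errors = [project(params, X) - u for X, u in zip(pattern_points, observed_points)]
    return np.concatenate(errors)

def calibrate(initial_params, pattern_points, observed_points, project):
    result = least_squares(reprojection_residuals, initial_params,
                           args=(pattern_points, observed_points, project))
    return result.x   # refined intrinsic and extrinsic parameters
```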

HySCaS simultaneously estimates the intrinsic and extrinsic parameters of stereo rigs composed of N cameras represented by N different projection models. The calibration procedure is classical in the sense that users who are familiar with existing freely available calibration software will find ours straightforward to use. (Examples of such software include the Camera Calibration Toolbox for Matlab, the Omnidirectional Calibration Toolbox, and the calibration tool of the Visual Servoing Platform library.)

Another advantage of our software is its low computation time: calibrating a perspective-and-fisheye hybrid stereo rig takes only 2.5 seconds, and calibrating a rig with two perspective cameras takes even less, 0.5 seconds. Yet another benefit is that HySCaS accepts more than one type of calibration pattern (a known set of coplanar 3D points whose image projections serve as input data). Chessboard and dot patterns are currently supported, and a ring pattern will be implemented in the near future. Finally, HySCaS includes the perspective and unified projection models. The paraboloidal projection model is also available in the software, and others, such as the hyperboloidal one, will soon be integrated.


Figure 3. A screenshot of Hybrid Stereoscopic Calibration Software with two sets of perspective and fisheye images.

Calibration of a perspective camera, an omnidirectional camera, and a stereo rig with two perspective cameras was performed with HySCaS, and the parameters obtained were compared with those determined by other methods.4 (Note that other types of stereo rigs could not be calibrated before HySCaS, so comparisons cannot be drawn in those cases.) The results were similar or equivalent to those given by the alternatives, which validates our approach. Using the intrinsic and extrinsic parameters estimated by HySCaS, the perspective-and-fisheye stereo rig embedded on a UAV allowed the altitude of the flying robot to be estimated with a precision of about 98%: 3.2cm of mean error with respect to a laser altimeter for altitudes between 0.55 and 2.2m.

In summary, HySCaS is software capable of calibrating hybrid rigs quickly and accurately. We chose to present the particular example of a perspective-and-fisheye stereo rig, but our method is applicable to other configurations. In the near future we aim to improve our software by adding practical tools, such as 3D visualization of the results and the possibility of acquiring calibration images directly within HySCaS, as well as more fundamental ones, such as new projection models and the ability to estimate the distortions induced by some lenses.


Guillaume Caron
Institut National de la Recherche en Informatique et Automatique (INRIA)
Rennes, France
and
Université de Picardie Jules Verne
Amiens, France

Guillaume Caron received his master's from the University of Reims Champagne-Ardennes, France (2007), and a PhD in robotics from the Université de Picardie Jules Verne (2010). He is currently working at INRIA as a postdoc.

Damien Eynard
Laboratoire Modélisation, Information et Systèmes (MIS)
Université de Picardie Jules Verne
Amiens, France

Damien Eynard completed a master's (2007) while working in parallel in the glass industry. He is currently doing a PhD on hybrid vision systems to estimate the navigation parameters of a UAV.


References:
1. D. Eynard, P. Vasseur, C. Demonceaux, V. Fremont, UAV altitude estimation by mixed stereoscopic vision, Proc. IEEE Int'l Conf. Intell. Robots Syst., pp. 646-651, 2010. doi:10.1109/IROS.2010.5652254
2. D. Eynard, P. Vasseur, C. Demonceaux, V. Fremont, UAV motion estimation using hybrid stereoscopic vision, 2011. Paper accepted at the Int'l Assoc. Pattern Recognit. Conf. Machine Vision Appl. in Nara, Japan, 13–15 June 2011.
3. HySCaS: Hybrid Stereoscopic Calibration Software. http://www.hyscas.com Accessed 1 June 2011.
4. G. Caron, D. Eynard, Multiple camera types simultaneous stereo calibration, 2011. Paper accepted at the Int'l Conf. Robot. Automat. in Shanghai, China, 9–13 May 2011.