
Proceedings Paper

Dense-disparity estimation from feature correspondences
Author(s): Janusz Konrad; Zhong-Dan Lan

Paper Abstract

Stereoscopic disparity plays an important role in the processing and compression of 3D imagery. For example, dense disparity fields are used to reconstruct intermediate images. Although dense disparity can be reliably estimated for small camera baselines using gradient-based methods, this is not the case for large baselines, where the underlying assumptions are violated. Block-matching algorithms fare better, but they are likely to get trapped in a local minimum because of the enlarged search space. An appropriate way to estimate large disparities is to use feature points; however, since feature points are distinctive, they are also sparse. In this paper, we propose a disparity estimation method that combines the reliability of feature-based correspondence methods with the resolution of dense approaches. In the first step, we find feature points in the left and right images using the Harris operator. In the second step, we select those feature points that allow one-to-one left-right correspondence based on a cross-correlation measure. In the third step, we use the computed correspondence points to control the computation of dense disparity via regularized block matching that minimizes matching and disparity-smoothness errors. The approach has been tested on several large-baseline stereo pairs with encouraging initial results.
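The abstract's three-step pipeline can be sketched roughly as follows. This is a minimal illustration in Python (OpenCV, NumPy, SciPy), not the authors' implementation: it assumes a near-rectified grayscale pair with non-negative disparity, uses OpenCV's Harris detector and a zero-mean normalized cross-correlation score for the one-to-one (mutual-best) matching test, and stands in for the paper's regularized block matching with a simplified penalty that ties each pixel's disparity to a prior interpolated from the sparse correspondences. All window sizes, thresholds, and the smoothness weight are illustrative assumptions.

```python
# Hypothetical sketch of the three-step pipeline described in the abstract.
import cv2
import numpy as np
from scipy.interpolate import griddata

def harris_points(gray, max_corners=500):
    """Step 1: detect Harris feature points, returned as (x, y) coordinates."""
    pts = cv2.goodFeaturesToTrack(gray, maxCorners=max_corners, qualityLevel=0.01,
                                  minDistance=7, useHarrisDetector=True, k=0.04)
    return pts.reshape(-1, 2) if pts is not None else np.empty((0, 2))

def patch(gray, x, y, r):
    """(2r+1)x(2r+1) window around (x, y), as float64."""
    return gray[y - r:y + r + 1, x - r:x + r + 1].astype(np.float64)

def ncc(a, b):
    """Zero-mean normalized cross-correlation between two equal-size patches."""
    a = a - a.mean(); b = b - b.mean()
    return float((a * b).sum() / (np.sqrt((a * a).sum() * (b * b).sum()) + 1e-9))

def mutual_matches(gray_l, gray_r, pts_l, pts_r, r=5, min_ncc=0.8):
    """Step 2: keep only one-to-one (mutual-best) correspondences by NCC."""
    h, w = gray_l.shape
    inside = lambda x, y: r <= x < w - r and r <= y < h - r
    pl = [(int(x), int(y)) for x, y in pts_l if inside(int(x), int(y))]
    pr = [(int(x), int(y)) for x, y in pts_r if inside(int(x), int(y))]
    if not pl or not pr:
        return []
    score = np.full((len(pl), len(pr)), -np.inf)
    for i, (xl, yl) in enumerate(pl):
        for j, (xr, yr) in enumerate(pr):
            # Assumption: near-rectified pair, non-negative disparity.
            if abs(yl - yr) <= 1 and xl >= xr:
                score[i, j] = ncc(patch(gray_l, xl, yl, r), patch(gray_r, xr, yr, r))
    pairs = []
    for i in range(len(pl)):
        j = int(np.argmax(score[i]))
        # Keep the pair only if it is the best match in both directions.
        if score[i, j] >= min_ncc and int(np.argmax(score[:, j])) == i:
            pairs.append((pl[i], pr[j]))
    return pairs

def dense_disparity(gray_l, gray_r, pairs, r=3, search=4, lam=50.0):
    """Step 3 (simplified): block matching around a prior disparity field
    interpolated from the sparse correspondences, with a quadratic penalty
    on deviation from that prior standing in for the smoothness term."""
    h, w = gray_l.shape
    pts = np.array([[yl, xl] for (xl, yl), _ in pairs], dtype=np.float64)
    d = np.array([xl - xr for (xl, _), (xr, _) in pairs], dtype=np.float64)
    yy, xx = np.mgrid[0:h, 0:w]
    prior = griddata(pts, d, (yy, xx), method='linear', fill_value=d.mean())
    disp = np.zeros((h, w))
    for y in range(r, h - r):                    # plain Python loops: slow, but clear
        for x in range(r, w - r):
            d0 = prior[y, x]
            best, best_cost = d0, np.inf
            for cand in np.arange(max(0.0, d0 - search), d0 + search + 1):
                xr = int(round(x - cand))
                if xr < r or xr >= w - r:
                    continue
                ssd = np.mean((patch(gray_l, x, y, r) - patch(gray_r, xr, y, r)) ** 2)
                cost = ssd + lam * (cand - d0) ** 2  # data term + prior penalty
                if cost < best_cost:
                    best, best_cost = cand, cost
            disp[y, x] = best
    return disp

# Usage (hypothetical file names):
# gl = cv2.imread('left.png', cv2.IMREAD_GRAYSCALE)
# gr = cv2.imread('right.png', cv2.IMREAD_GRAYSCALE)
# pairs = mutual_matches(gl, gr, harris_points(gl), harris_points(gr))
# disp = dense_disparity(gl, gr, pairs)
```

Here the sparse matches act as soft constraints through the interpolated prior; the paper's actual regularizer couples neighboring disparities directly, so this penalty should be read as a stand-in for that smoothness term rather than a faithful reproduction.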

Paper Details

Date Published: 3 May 2000
PDF: 12 pages
Proc. SPIE 3957, Stereoscopic Displays and Virtual Reality Systems VII, (3 May 2000); doi: 10.1117/12.384433
Author Affiliations:
Janusz Konrad, INRS-Telecommunications (Canada)
Zhong-Dan Lan, INRS-Telecommunications (Canada)


Published in SPIE Proceedings Vol. 3957:
Stereoscopic Displays and Virtual Reality Systems VII
John O. Merritt; Stephen A. Benton; Andrew J. Woods; Mark T. Bolas, Editor(s)

© SPIE.