
Proceedings Paper
Disparity fusion using depth and stereo cameras for accurate stereo correspondence

Format | Member Price | Non-Member Price
---|---|---
 | $17.00 | $21.00
Paper Abstract
Three-dimensional (3D) content creation has received considerable attention owing to the numerous successes of 3D entertainment, and accurate stereo correspondence is necessary for efficient 3D content creation. In this paper, we propose a disparity map estimation method based on stereo correspondence. The proposed system uses a depth camera together with a stereo camera set. While the stereo set carries out disparity estimation, the depth camera information is projected to the left and right camera positions using a 3D transformation and upsampled to match the image size. The upsampled depth is then used to obtain disparity data at the left and right positions. Finally, the disparity data from the two sensors are combined. To evaluate the proposed method, we applied view synthesis using the acquired disparity maps. The experimental results demonstrate that our method produces more accurate disparity maps than conventional approaches that use a single depth sensor.
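The conversion and fusion steps summarized in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the standard relation d = f·B/Z converts the upsampled depth map to disparity, and a simple confidence-weighted blend stands in for the paper's fusion scheme. The function names, the `depth_conf` weight, and the zero-as-invalid convention are all illustrative assumptions.

```python
import numpy as np

def depth_to_disparity(depth, focal_length, baseline):
    """Convert an (upsampled) depth map in metres to disparity in pixels
    via d = f * B / Z. Zero depth is treated as invalid (assumption)."""
    disparity = np.zeros_like(depth, dtype=np.float64)
    valid = depth > 0
    disparity[valid] = focal_length * baseline / depth[valid]
    return disparity

def fuse_disparities(stereo_disp, depth_disp, depth_conf=0.7):
    """Blend stereo-matched and depth-derived disparity maps.

    Where both sources are valid (nonzero), take a weighted average;
    where only one is valid, fall back to it. The weight is a placeholder,
    not a value from the paper.
    """
    both_valid = (stereo_disp > 0) & (depth_disp > 0)
    return np.where(
        both_valid,
        depth_conf * depth_disp + (1.0 - depth_conf) * stereo_disp,
        np.maximum(stereo_disp, depth_disp),  # whichever source is available
    )
```

For example, with a 500-pixel focal length and a 0.1 m baseline, a point 1 m away yields a 50-pixel disparity, which would then be blended with the stereo-matching result at that pixel.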
Paper Details
Date Published: 17 March 2015
PDF: 8 pages
Proc. SPIE 9393, Three-Dimensional Image Processing, Measurement (3DIPM), and Applications 2015, 93930T (17 March 2015); doi: 10.1117/12.2078665
Published in SPIE Proceedings Vol. 9393:
Three-Dimensional Image Processing, Measurement (3DIPM), and Applications 2015
Robert Sitnik; William Puech, Editor(s)
Author Affiliations:
Woo-Seok Jang, Gwangju Institute of Science and Technology (Korea, Republic of)
Yo-Sung Ho, Gwangju Institute of Science and Technology (Korea, Republic of)
© SPIE.
