Optical Engineering • Open Access

Depth error compensation for camera fusion system
Author(s): Cheon Lee; Sung-Yeol Kim; Byeongho Choi; Yong-Moo Kwon; Yo-Sung Ho

Paper Abstract

When a three-dimensional (3-D) video system includes multiview video generation using depth data to provide a more realistic 3-D viewing experience, accurate depth map acquisition is an important task. To generate a precise depth map in real time, we can build a camera fusion system with multiple color cameras and one time-of-flight (TOF) camera; however, such a system suffers from depth errors, including depth flickering, empty holes in the warped depth map, and mixed pixels around object boundaries. In this paper, we propose three methods to reduce these depth errors. To suppress depth flickering in the temporal domain, we apply a temporal enhancement method based on modified joint bilateral filtering at the TOF camera side. We then fill the empty holes in the warped depth map by selecting a virtual depth and applying a weighted depth filtering method. After hole filling, we remove mixed pixels and replace them with new depth values using an adaptive joint multilateral filter. Experimental results show that the proposed methods reduce depth errors significantly in near real time.
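Of the three steps above, the flicker-reduction step builds on joint bilateral filtering. The sketch below is not the authors' modified method; it shows a standard joint bilateral filter in plain NumPy, where a registered color image supplies the range weights that guide smoothing of the depth map. The function name and the parameter values (radius, sigma_s, sigma_r) are illustrative assumptions; the paper's temporal variant would additionally draw samples from neighboring frames.

import numpy as np

def joint_bilateral_filter(depth, color, radius=3, sigma_s=2.0, sigma_r=10.0):
    # Minimal sketch (not the paper's implementation): smooth a depth map
    # (H x W) using a registered grayscale color image (H x W) as guidance.
    # Spatial Gaussian width: sigma_s; range Gaussian on the guidance
    # values: sigma_r. Parameter values are hypothetical.
    h, w = depth.shape
    out = np.zeros((h, w), dtype=np.float64)
    # The spatial kernel depends only on pixel offsets, so compute it once.
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2.0 * sigma_s**2))
    pad_d = np.pad(depth.astype(np.float64), radius, mode='edge')
    pad_c = np.pad(color.astype(np.float64), radius, mode='edge')
    for i in range(h):
        for j in range(w):
            d_win = pad_d[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            c_win = pad_c[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Range weights come from the color guidance, not from the
            # noisy depth itself, so depth edges that coincide with color
            # edges are preserved while flat regions are smoothed.
            rng = np.exp(-(c_win - float(color[i, j]))**2 / (2.0 * sigma_r**2))
            wgt = spatial * rng
            out[i, j] = np.sum(wgt * d_win) / np.sum(wgt)
    return out

# Example: denoise a synthetic depth map guided by a step-edge image.
color = np.tile(np.linspace(0, 255, 64), (64, 1))
depth = np.where(color > 128, 2000.0, 1000.0) + np.random.normal(0, 20, (64, 64))
smoothed = joint_bilateral_filter(depth, color)

Taking the range weight from the guidance image is the design point that a joint multilateral filter, as used in the paper's mixed-pixel step, generally extends with additional weighting terms.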

Paper Details

Date Published: 9 July 2013
PDF: 14 pages
Opt. Eng. 52(7), 073103 (2013). doi: 10.1117/1.OE.52.7.073103
Published in: Optical Engineering Volume 52, Issue 7
Author Affiliations
Cheon Lee, Gwangju Institute of Science and Technology (Korea, Republic of)
Sung-Yeol Kim, Gwangju Institute of Science and Technology (Korea, Republic of)
Byeongho Choi, Korea Electronics Technology Institute (Korea, Republic of)
Yong-Moo Kwon, Korea Institute of Science and Technology (Korea, Republic of)
Yo-Sung Ho, Gwangju Institute of Science and Technology (Korea, Republic of)


© SPIE