
Optical Engineering

Improved moving object segmentation by multiresolution and variable thresholding
Author(s): Hsien-Huang Peter Wu; Jen-Hung Chang; Ping-Kuo Weng; Ying-Yih Wu

Paper Abstract

Segmentation of moving objects in image sequences by change detection is an important topic in multimedia and surveillance applications. One popular approach is to model the background of the scene and then threshold the differences between the background and the input images to detect the changes caused by moving objects. Although this idea is simple and effective, the selection of a proper threshold value involves a trade-off between false alarms and misdetections. In this paper, a new thresholding method for moving object segmentation in scenes without a dynamic background or rapid illumination changes is proposed to avoid misdetections while reducing false alarms. The new approach adopts the concept of thresholding-with-hysteresis, using a multiresolution and variable thresholding (MRVT) scheme to improve segmentation performance. Combined with a shadow-removal module, MRVT can generate accurate moving object masks. Segmentation results are evaluated qualitatively and quantitatively for indoor scenes, and the effectiveness of MRVT is encouraging. Compared with two other state-of-the-art approaches, the proposed method achieves more accurate object boundaries, fewer false alarms, and reduced fragmentation of objects.
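To illustrate the general idea of thresholding-with-hysteresis for change detection described in the abstract, the sketch below shows a generic background-differencing segmentation with a low/high threshold pair. It is not the authors' MRVT scheme (which adds multiresolution and variable thresholds); the function name and the threshold values T_LOW and T_HIGH are hypothetical choices for illustration only.

```python
# Illustrative sketch: background differencing with hysteresis (dual)
# thresholding. Not the authors' exact MRVT method; thresholds are
# hypothetical placeholders.
import numpy as np
from scipy import ndimage

T_LOW, T_HIGH = 15, 40   # hypothetical low/high difference thresholds

def segment_moving_objects(frame, background):
    """Return a binary mask of changed (moving) pixels.

    frame, background: 2-D grayscale arrays of equal shape.
    A pixel is kept if its difference exceeds T_LOW and it belongs to a
    connected region containing at least one pixel above T_HIGH.
    """
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))

    weak = diff > T_LOW      # permissive mask: few misdetections, many false alarms
    strong = diff > T_HIGH   # strict mask: few false alarms, may miss object parts

    # Keep only weak regions that are "seeded" by at least one strong pixel.
    labels, _ = ndimage.label(weak)
    seeded = np.unique(labels[strong])
    mask = np.isin(labels, seeded[seeded > 0])
    return mask
```

The trade-off the abstract mentions is visible here: a single threshold must choose between the permissive and strict masks, whereas the hysteresis combination keeps weakly changed pixels only when they are connected to strongly changed ones, reducing false alarms without discarding object interiors.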

Paper Details

Date Published: 1 November 2006
PDF: 12 pages
Opt. Eng. 45(11) 117003 doi: 10.1117/1.2393227
Published in: Optical Engineering Volume 45, Issue 11
Author Affiliations:
Hsien-Huang Peter Wu, National Yunlin Univ. of Science and Technology (Taiwan)
Jen-Hung Chang, National Yunlin Univ. of Science and Technology (Taiwan)
Ping-Kuo Weng, Chung Shan Institute of Science and Technology (Taiwan)
Ying-Yih Wu, Chung Shan Institute of Science and Technology (Taiwan)


© SPIE.