
Journal of Electronic Imaging

New method for unsupervised segmentation of moving objects in infrared videos
Author(s): Chaobo Min; Junju Zhang; Benkang Chang; Baohui Zhang; Yingjie Li

Paper Abstract

A new method for unsupervised segmentation of moving objects in infrared videos is presented. The method consists of two steps: difference-image quantization and spatial segmentation. In the first step, the changed pixels in the difference image are quantized into several classes by Bayes decision, which clusters together the changed pixels belonging to the same moving object. The pixels of the difference image are then replaced by their corresponding class labels, forming a class-map of the difference image. In the second step, each class in the class-map is treated as a subset of the possible seeds of moving objects, and a self-adaptive region-growing method performs image segmentation starting from each of these subsets. A particular focus of this work is spatial segmentation: a criterion is proposed for evaluating moving-object segmentation in infrared videos without ground truth. This criterion scores the segmentation masks grown from the different seed subsets, and the best-scoring segmented image is taken as the final segmentation result. Experiments demonstrate the advantage and robustness of the proposed algorithm on real infrared videos.
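The two-step pipeline in the abstract can be sketched in NumPy. This is only an illustrative stand-in, not the authors' method: the equal-width binning below substitutes for the paper's Bayes-decision quantization, and the fixed-tolerance growing substitutes for the self-adaptive region growing; all function names and parameters are hypothetical.

```python
import numpy as np
from collections import deque

def quantize_difference(diff, n_classes=3):
    """Quantize changed pixels of a difference image into class labels
    (label 0 = unchanged). Equal-width magnitude bins stand in for the
    paper's Bayes-decision quantization."""
    labels = np.zeros(diff.shape, dtype=int)
    changed = diff > 0
    if changed.any():
        vals = diff[changed]
        # interior bin edges; classes are numbered 1..n_classes
        edges = np.linspace(vals.min(), vals.max(), n_classes + 1)[1:-1]
        labels[changed] = np.digitize(vals, edges) + 1
    return labels

def region_grow(image, seeds, tol):
    """Grow a binary mask from seed pixels, accepting 4-neighbours whose
    intensity lies within `tol` of the seed-region mean (a simplified,
    fixed-tolerance version of self-adaptive region growing)."""
    mask = np.zeros(image.shape, dtype=bool)
    mean = np.mean([image[s] for s in seeds])
    q = deque(seeds)
    for s in seeds:
        mask[s] = True
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < image.shape[0] and 0 <= nc < image.shape[1]
                    and not mask[nr, nc]
                    and abs(image[nr, nc] - mean) <= tol):
                mask[nr, nc] = True
                q.append((nr, nc))
    return mask

# Toy frame: a bright 3x3 "moving object" on a dark background.
frame = np.zeros((6, 6))
frame[2:5, 2:5] = 10.0
diff = frame.copy()          # difference against an all-zero previous frame
labels = quantize_difference(diff, n_classes=2)
seeds = list(zip(*np.nonzero(labels == labels.max())))
mask = region_grow(frame, seeds, tol=1.0)
```

In the full method, one mask would be grown per class in the class-map and the proposed no-ground-truth criterion would select the best mask; here a single class is grown for brevity.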

Paper Details

Date Published: 16 December 2013
PDF: 14 pages
J. Electron. Imaging 22(4), 043026 (2013). doi: 10.1117/1.JEI.22.4.043026
Published in: Journal of Electronic Imaging Volume 22, Issue 4
Author Affiliations:
Chaobo Min, Nanjing Univ. of Science and Technology (China)
Junju Zhang, Nanjing Univ. of Science and Technology (China)
Benkang Chang, Nanjing Univ. of Science and Technology (China)
Baohui Zhang, Nanjing Univ. of Science and Technology (China)
Yingjie Li, Nanjing Univ. of Science and Technology (China)


© SPIE.