
Proceedings Paper

Object tracking via Spatio-Temporal Context learning based on multi-feature fusion in stationary scene
Author(s): Yunfei Cheng; Wu Wang

Paper Abstract

A robust algorithm is proposed for tracking objects in a stationary scene under dynamic challenges including illumination change, pose variation, and occlusion. To cope with these factors, Spatio-Temporal Context learning based on Multi-feature fusion (MSTC) is integrated within a fusion framework. Unlike the original Spatio-Temporal Context learning (STC) algorithm, which exploits only low-level features (i.e., image intensity and position) from the target and its surrounding regions, our approach combines high-level features such as the Histogram of Oriented Gradients (HOG) with low-level features for tracker interaction and selection at the decision level, yielding robust tracking performance. Experimental results on benchmark datasets demonstrate that the proposed algorithm performs robustly and favorably against the original algorithm.
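The abstract does not give the fusion rule, so the following is only a minimal sketch of decision-level fusion, assuming two per-frame confidence maps (one from the intensity-based STC model and one from a HOG-based model) are weighted by their peak-to-sidelobe ratios; the function names and the PSR weighting are illustrative assumptions, not the authors' implementation.

# Illustrative sketch (not the authors' code): decision-level fusion of two
# tracker confidence maps, weighted by peak-to-sidelobe ratio (PSR).
import numpy as np

def psr(conf, eps=1e-8):
    # Peak-to-sidelobe ratio: how sharply the response peaks above its background.
    return (conf.max() - conf.mean()) / (conf.std() + eps)

def fuse_confidence_maps(conf_intensity, conf_hog, eps=1e-8):
    # Weight each map by its reliability (PSR) and combine; the argmax of the
    # fused map gives the estimated target position for the current frame.
    w_i, w_h = psr(conf_intensity), psr(conf_hog)
    fused = (w_i * conf_intensity + w_h * conf_hog) / (w_i + w_h + eps)
    target_pos = np.unravel_index(np.argmax(fused), fused.shape)
    return target_pos, fused

# Example with synthetic confidence maps (stand-ins for the STC and HOG responses):
conf_i = np.random.rand(64, 64)
conf_h = np.random.rand(64, 64)
pos, fused = fuse_confidence_maps(conf_i, conf_h)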

Paper Details

Date Published: 24 October 2017
PDF: 6 pages
Proc. SPIE 10462, AOPC 2017: Optical Sensing and Imaging Technology and Applications, 104620Z (24 October 2017); doi: 10.1117/12.2283058
Author Affiliations:
Yunfei Cheng, The Third Research Institute of Ministry of Public Security (China)
Wu Wang, The Third Research Institute of Ministry of Public Security (China)


Published in SPIE Proceedings Vol. 10462:
AOPC 2017: Optical Sensing and Imaging Technology and Applications
Yadong Jiang; Haimei Gong; Weibiao Chen; Jin Li, Editor(s)
