
Proceedings Paper

A robust visual tracking via nonlocal correlation filters
Author(s): Yanxia Wei; Zhen Jiang; Dongxun Chen

Paper Abstract

Visual object tracking has become increasingly popular in the research community because of its wide applications and research significance. However, occlusion is one of the major factors that seriously degrade tracking performance. To address this issue, we propose a novel nonlocal correlation filter based tracking method. The proposed tracker exploits an explicit coupled mechanism built on a global filter and several local part filters, and employs spatial geometric constraints between the global object and its local patches to preserve the object's structure. Compared with existing correlation filter based trackers, the proposed method has three advantages. (1) To represent the target candidate completely, we learn correlation filters not only from the global sample but also from local sample parts: the global filter guarantees the overall accuracy of the tracked object, while the local filters preserve details of the target to cope with challenging cases such as occlusion or deformation. In addition, an effective adaptive selection mechanism chooses the most distinctive and discriminative parts for tracking, which avoids the unnecessary computational burden of tracking all parts and simultaneously improves the robustness of the tracker. (2) By adaptively weighting the global sample and each local part, the integration mechanism emphasizes visible parts and suppresses the influence of occluded parts, further improving tracking robustness. (3) Unlike trackers that search over a predefined scale pyramid, we propose a simple yet effective scale estimation strategy that accurately computes the current scale of the target. To verify our method, we conduct extensive qualitative and quantitative experiments on challenging benchmark image sequences. Experimental results demonstrate that the proposed method performs favorably against several state-of-the-art trackers.
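The abstract does not give the paper's exact formulation, but the machinery it builds on is standard correlation filter tracking: a ridge-regression filter learned in the Fourier domain, a response map whose peak localizes the target, and (here) a weighted fusion of the global response with per-part responses. The following NumPy sketch illustrates only that generic machinery; the function names, the single-sample filter, and the simple normalized weighting are illustrative assumptions, not the authors' method.

```python
import numpy as np

def learn_filter(patch, target, lam=1e-2):
    """Learn a correlation filter in the Fourier domain.

    Solves a single-sample ridge regression: the filter H maps the
    patch to the desired response `target` (typically a Gaussian
    peaked at the object center). `lam` is the regularization term.
    """
    F = np.fft.fft2(patch)
    G = np.fft.fft2(target)
    return (G * np.conj(F)) / (F * np.conj(F) + lam)

def respond(H, patch):
    """Correlate a new patch with the learned filter.

    Returns the spatial response map; its peak location gives the
    (circular) translation of the target within the patch.
    """
    F = np.fft.fft2(patch)
    return np.real(np.fft.ifft2(H * F))

def fuse(responses, weights):
    """Weighted fusion of global and local-part response maps.

    Weights are normalized to sum to one; in a part-based tracker the
    weights would be set adaptively so that occluded parts (with weak,
    unreliable responses) contribute little to the fused map.
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(wi * r for wi, r in zip(w, responses))
```

For example, learning a filter whose desired response is a Gaussian centered on the target, then correlating a circularly shifted copy of the patch, moves the response peak by exactly that shift; fusing the global map with part maps then follows the same peak-finding step on the weighted sum.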

Paper Details

Date Published: 16 October 2019
PDF: 7 pages
Proc. SPIE 11205, Seventh International Conference on Optical and Photonic Engineering (icOPEN 2019), 112050R (16 October 2019); doi: 10.1117/12.2542125
Author Affiliations:
Yanxia Wei, Shanghai Univ. (China); Liaocheng Univ. (China)
Zhen Jiang, Shanghai Univ. (China)
Dongxun Chen, Shanghai Univ. (China)

Published in SPIE Proceedings Vol. 11205:
Seventh International Conference on Optical and Photonic Engineering (icOPEN 2019)
Anand Asundi; Motoharu Fujigaki; Huimin Xie; Qican Zhang; Song Zhang; Jianguo Zhu; Qian Kemao, Editor(s)

© SPIE.