
Proceedings Paper

Convolutional residual learning with sparse robust samples and multi-feature fusion for object tracking
Author(s): Huiling Gao; Jie Liu; Chaorong Liu; Binshan Li; Zhengtian Zhao; Weirong Liu

Paper Abstract

Discriminative object trackers based on deep learning have recently demonstrated excellent performance. However, tracking accuracy remains challenged by contaminated training samples and diverse complex scenarios. To address this, we propose a tracker based on sparse robust samples and convolutional residual learning with multi-feature fusion (SR_MFCRL). First, a sparse robust sample set (SRSS) is introduced to improve the robustness of the network: sparse representation is employed to estimate the best candidate, and joint detection combining the response peak value with occlusion detection determines the contamination degree of each sample. Second, a multi-feature fusion residual network (MRN) is proposed, whose two base branches capture the response outputs of different features to achieve higher positioning accuracy. Extensive experiments on OTB-2013 show that the proposed tracker achieves outstanding tracking accuracy and robustness.
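The abstract's two key ideas, fusing response maps from multiple feature branches and scoring sample contamination from the response peak, can be illustrated with a minimal sketch. This is not the authors' implementation: the weighted-sum fusion, the peak-to-sidelobe ratio (a standard sharpness measure in correlation-filter tracking), and all function names here are illustrative assumptions.

```python
import numpy as np

def fuse_responses(resp_a, resp_b, w=0.5):
    """Fuse two branch response maps by a weighted sum
    (a hypothetical stand-in for the paper's fusion scheme)."""
    return w * resp_a + (1.0 - w) * resp_b

def peak_to_sidelobe_ratio(resp, exclude=5):
    """Measure how sharply the response peak stands out.
    A low ratio suggests an occluded or contaminated sample."""
    r0, c0 = np.unravel_index(np.argmax(resp), resp.shape)
    peak = resp[r0, c0]
    # Mask out a small window around the peak; the rest is the sidelobe.
    mask = np.ones(resp.shape, dtype=bool)
    mask[max(0, r0 - exclude):r0 + exclude + 1,
         max(0, c0 - exclude):c0 + exclude + 1] = False
    sidelobe = resp[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-8)

# A sharp, agreeing peak in both branches yields a confident fused estimate;
# a flat response would yield a low ratio and flag the sample as unreliable.
resp_a = np.zeros((40, 40)); resp_a[20, 20] = 1.0
resp_b = np.zeros((40, 40)); resp_b[20, 20] = 0.8
fused = fuse_responses(resp_a, resp_b)
target = np.unravel_index(np.argmax(fused), fused.shape)  # estimated position
```

In a full tracker, samples whose ratio falls below a threshold would be excluded from (or down-weighted in) the training set, which is the intuition behind building a "sparse robust" sample set.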

Paper Details

Date Published: 6 May 2019
PDF: 8 pages
Proc. SPIE 11069, Tenth International Conference on Graphics and Image Processing (ICGIP 2018), 110690R (6 May 2019); doi: 10.1117/12.2524414
Author Affiliations
Huiling Gao, Lanzhou Univ. of Technology (China)
Jie Liu, Lanzhou Univ. of Technology (China)
Chaorong Liu, Lanzhou Univ. of Technology (China)
Binshan Li, Lanzhou Univ. of Technology (China)
Zhengtian Zhao, Lanzhou Univ. of Technology (China)
Weirong Liu, Lanzhou Univ. of Technology (China)

Published in SPIE Proceedings Vol. 11069:
Tenth International Conference on Graphics and Image Processing (ICGIP 2018)
Chunming Li; Hui Yu; Zhigeng Pan; Yifei Pu, Editor(s)

© SPIE.