Learning spatial-temporally regularized complementary kernelized correlation filters for visual tracking

Authors: Chengfang Song, Jun Wan, Yafu Xiao, Jing Li, Jun Chang, Zhenyang Su
Year: 2020
Source: Multimedia Tools and Applications. 79:25171-25188
ISSN: 1573-7721, 1380-7501
DOI: 10.1007/s11042-020-09028-9
Abstract: Despite the excellent performance shown by spatially regularized discriminative correlation filters (SRDCF) for visual tracking, some open issues hinder further performance gains: first, SRDCF formulates its model over multiple training images, which prevents it from exploiting the circulant structure of the training samples during learning and leads to a high computational burden; second, SRDCF cannot efficiently exploit powerful discriminative nonlinear kernels, which further limits its performance. In this paper, we present a novel spatial-temporally regularized complementary kernelized correlation filter (STRCKCF) based tracking approach. First, by introducing spatial-temporal regularization into the filter learning, STRCKCF formulates its model with only one training image, which not only facilitates exploiting the circulant structure in learning but also reasonably approximates the SRDCF model with multiple training images. Furthermore, by incorporating two types of kernels whose matrices are circulant, STRCKCF fully exploits the complementary traits of color and HOG features to learn a robust target representation efficiently. Moreover, STRCKCF can be efficiently optimized via the alternating direction method of multipliers (ADMM). Extensive evaluations on the OTB100 and VOT2016 visual tracking benchmarks demonstrate that the proposed method performs favorably against state-of-the-art trackers at 40 fps on a single CPU. Compared with SRDCF, STRCKCF provides an 8× speedup and gains 5.5% in AUC score on OTB100 and 8.4% in EAO score on VOT2016.
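The efficiency argument in the abstract rests on the circulant structure of cyclically shifted training samples: under the DFT, the circulant data matrix diagonalizes, so a ridge-regression correlation filter has a closed-form solution per frequency. A minimal single-channel, linear-kernel sketch of that idea follows (this is generic discriminative-correlation-filter training, not the authors' STRCKCF; the function names and single-image setup are illustrative assumptions):

```python
import numpy as np

def train_cf(x, y, lam=1e-4):
    """Ridge regression over all cyclic shifts of patch x with desired
    response y, solved per DFT frequency thanks to circulant structure."""
    x_hat = np.fft.fft2(x)
    y_hat = np.fft.fft2(y)
    # Closed-form per-frequency solution for a linear kernel.
    return y_hat * np.conj(x_hat) / (x_hat * np.conj(x_hat) + lam)

def detect(filt_hat, z):
    """Correlate the learned filter with a new patch z; the peak of the
    real response map gives the estimated target translation."""
    z_hat = np.fft.fft2(z)
    return np.real(np.fft.ifft2(filt_hat * z_hat))
```

Detecting on the training patch itself should reproduce (approximately) the desired Gaussian-shaped response, with the peak at the labeled target location.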
Database: OpenAIRE
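The abstract's ADMM-based optimization of a spatially regularized, single-image filter can be illustrated in miniature: splitting the filter into a data-term variable (solved per DFT frequency, again exploiting circulant structure) and a spatially weighted copy yields two closed-form subproblems plus a dual update. The 1-D objective, the weights `w`, and `rho` below are illustrative assumptions, not the paper's actual STRCKCF model:

```python
import numpy as np

def spatially_reg_cf_admm(x, y, w, rho=1.0, iters=500):
    """ADMM sketch for  min_f 0.5||x (*) f - y||^2 + 0.5||w . f||^2,
    where (*) is circular convolution and . is elementwise product.
    Split g = f: the f-step is diagonal in the Fourier domain, the
    g-step is elementwise in the spatial domain."""
    n = len(x)
    x_hat, y_hat = np.fft.fft(x), np.fft.fft(y)
    g = np.zeros(n)   # spatially regularized copy of the filter
    u = np.zeros(n)   # scaled dual variable
    for _ in range(iters):
        # f-subproblem: per-frequency closed form (circulant data term).
        f_hat = (np.conj(x_hat) * y_hat + rho * np.fft.fft(g - u)) / \
                (np.abs(x_hat) ** 2 + rho)
        f = np.real(np.fft.ifft(f_hat))
        # g-subproblem: elementwise shrinkage by the spatial weights.
        g = rho * (f + u) / (w ** 2 + rho)
        # Dual ascent on the constraint f = g.
        u = u + f - g
    return g
```

At convergence the ADMM iterate matches the direct solution of the same quadratic, obtained by solving the (small, dense) normal equations with an explicit circulant matrix; the point of ADMM here is that it never forms that matrix.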