Distractor-Aware Deep Regression for Visual Tracking

Authors: Ming Du, Yan Ding, Xiuyun Meng, Hua-Liang Wei, Yifan Zhao
Language: English
Publication year: 2019
Subject:
Source: Sensors, Vol 19, Iss 2, p 387 (2019)
Document type: article
ISSN: 1424-8220
DOI: 10.3390/s19020387
Description: In recent years, regression trackers have drawn increasing attention in the visual object tracking community due to their favorable performance and easy implementation. These algorithms directly learn a mapping from dense samples around the target object to Gaussian-like soft labels. However, in many real applications, the extremely imbalanced distribution of training samples usually hinders the robustness and accuracy of regression trackers on test data. In this paper, we propose a novel and effective distractor-aware loss function that mitigates this imbalance by highlighting the significant domain and severely penalizing the pure background. In addition, we introduce a fully differentiable hierarchy-normalized concatenation connection to exploit abstractions across multiple convolutional layers. Extensive experiments were conducted on five challenging benchmark tracking datasets, namely OTB-13, OTB-15, TC-128, UAV-123, and VOT17. The experimental results are promising and show that the proposed tracker performs favorably against nearly all the compared state-of-the-art approaches.
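The abstract's core idea, down-weighting the dominant pure-background samples while penalizing background locations that produce strong (distractor-like) responses, can be illustrated with a small sketch. This is not the paper's exact formulation; the Gaussian label construction, the threshold `tau`, and the penalty factor `alpha` are all assumptions introduced for illustration.

```python
import numpy as np

def gaussian_label(shape, center, sigma=2.0):
    """Gaussian-like soft label map centered on the target (assumed setup)."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = center
    return np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma ** 2))

def distractor_aware_loss(pred, label, tau=0.1, alpha=4.0):
    """Weighted L2 regression loss (illustrative, not the paper's exact form).

    Pixels in the significant domain (label > tau) keep full weight;
    pure-background pixels with a strong predicted response (potential
    distractors) are penalized more heavily by the factor `alpha`.
    """
    residual = pred - label
    weight = np.ones_like(label)
    background = label <= tau
    distractor = background & (pred > tau)  # background with strong response
    weight[distractor] = alpha              # severely penalize distractors
    return float(np.mean(weight * residual ** 2))
```

With `alpha = 1.0` the loss reduces to a plain mean-squared error; increasing `alpha` makes a spurious background peak contribute disproportionately, which is the balancing effect the abstract describes.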
Database: Directory of Open Access Journals