LaSOT: A High-quality Benchmark for Large-scale Single Object Tracking
Author: | Haibin Ling, Chunyuan Liao, Sijia Yu, Liting Lin, Heng Fan, Yong Xu, Hexin Bai, Fan Yang, Ge Deng, Peng Chu |
---|---|
Language: | English |
Year of publication: | 2018 |
Subject: |
FOS: Computer and information sciences; Computer Science - Computer Vision and Pattern Recognition (cs.CV); computer vision; video tracking; bounding box; benchmark; feature learning; artificial intelligence; software engineering; image processing |
Source: | CVPR |
Description: | In this paper, we present LaSOT, a high-quality benchmark for Large-scale Single Object Tracking. LaSOT consists of 1,400 sequences with more than 3.5M frames in total. Each frame in these sequences is carefully and manually annotated with a bounding box, making LaSOT, to the best of our knowledge, the largest densely annotated tracking benchmark. The average video length in LaSOT is more than 2,500 frames, and each sequence comprises various challenges arising from the wild, where target objects may disappear and reappear in the view. By releasing LaSOT, we expect to provide the community with a large-scale, high-quality dedicated benchmark both for training deep trackers and for the reliable evaluation of tracking algorithms. Moreover, considering the close connection between visual appearance and natural language, we enrich LaSOT with additional language specifications, aiming to encourage the exploration of natural linguistic features for tracking. A thorough experimental evaluation of 35 tracking algorithms on LaSOT is presented with detailed analysis, and the results demonstrate that there is still considerable room for improvement. Comment: 18 pages, including supplementary material; minor revisions and typo corrections. |
Database: | OpenAIRE |
External link: |