A Novel Spatial and Temporal Context-Aware Approach for Drone-Based Video Object Detection
Author: | Zhaoliang Pi, Yingping Li, Xier Chen, Yanchao Lian, Licheng Jiao, Yinan Wu |
Year: | 2019 |
Subject: | Computer science; Computer vision; Object detection; Detector; Feature extraction; Motion blur; Tracking; Context; Drone; Artificial intelligence; ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION |
Source: | ICCV Workshops |
DOI: | 10.1109/iccvw.2019.00027 |
Description: | Nowadays, with the advent of Unmanned Aerial Vehicles (UAVs), drones equipped with cameras have been rapidly deployed across a wide range of applications. Consequently, automatic and effective object detection plays an important role in understanding and analyzing the visual data collected by drones, with further applications in both civilian and military fields. However, object detection in drone-based videos still faces many challenges, such as defocus, motion blur, occlusion, and variations in illumination, viewpoint, and object size, which leave only weak visual cues for successful detection. In this paper, we propose a novel approach for object detection in drone-based videos that combines multi-model fusion detection, an efficient tracker together with a new method for evaluating track confidence, and false-positive analysis based on scene-level context information and inference. Experimental results on the VisDrone2018-VID [44] dataset demonstrate the effectiveness of the proposed approach. |
Database: | OpenAIRE |
External link: |
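
The description above mentions multi-model fusion detection feeding a tracker, but the record gives no implementation details. The following is only a minimal sketch, assuming each detector emits per-frame boxes as (x1, y1, x2, y2, score) and that fusion is realized by pooling all models' boxes and applying greedy IoU-based non-maximum suppression; the function names, data layout, and threshold are illustrative and not taken from the paper.

```python
# Minimal sketch (not the authors' code): merging per-frame detections from
# several independent detectors before passing them to a tracker.
from typing import List, Tuple

Box = Tuple[float, float, float, float, float]  # x1, y1, x2, y2, score

def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def fuse_detections(per_model: List[List[Box]], iou_thr: float = 0.5) -> List[Box]:
    """Pool detections from multiple models and keep them greedily by score."""
    pooled = sorted((d for dets in per_model for d in dets),
                    key=lambda d: d[4], reverse=True)
    kept: List[Box] = []
    for det in pooled:
        if all(iou(det, k) < iou_thr for k in kept):
            kept.append(det)
    return kept

# Usage: detections for one video frame from two hypothetical detectors.
frame_fused = fuse_detections([
    [(10, 10, 50, 60, 0.9), (100, 40, 140, 90, 0.6)],   # detector A
    [(12, 11, 51, 62, 0.8), (200, 20, 240, 70, 0.7)],   # detector B
])
print(frame_fused)
```

When the detectors are well calibrated, score-weighted box averaging (e.g., weighted box fusion) is another common way to combine their outputs instead of keeping only the highest-scoring box per cluster.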