A Multisource Heterogeneous Data Fusion Method for Pedestrian Tracking
Author: | Baocai Yin, Yanfeng Sun, Zhenlian Shi, Yongli Hu, Linxin Xiong |
---|---|
Language: | English |
Year of publication: | 2015 |
Subject: | Article Subject; Computer science; General Mathematics; General Engineering; Computing Methodologies: Image Processing and Computer Vision; Tracking system; Sensor fusion; Tracking (particle physics); Feature (computer vision); Video tracking; RGB color model; Eye tracking; Computer vision; Artificial intelligence; Camera resectioning; lcsh: Mathematics (QA1-939); lcsh: Engineering (General). Civil engineering (General) (TA1-2040) |
Source: | Mathematical Problems in Engineering, Vol 2015 (2015) |
ISSN: | 1563-5147 |
Description: | Traditional visual pedestrian tracking methods perform poorly when faced with occlusion, illumination changes, and complex backgrounds. In principle, collecting more sensing information should resolve these issues, but fusing different sensing sources into an accurate tracking result is extremely challenging. In this study, we develop a pedestrian tracking method that fuses multisource heterogeneous sensing information: video, RGB-D sequences, and inertial sensor data. In our method, an RGB-D sequence is used to position the target locally by fusing texture and depth features. The local position is then used to eliminate the cumulative error (drift) of the inertial sensor positioning. A camera calibration process maps the inertial sensor position onto the video image plane, where the visual tracking position and the mapped position are fused using a similarity feature to obtain accurate tracking results (see the sketch after this record). Experiments in real scenarios show that the developed method outperforms existing tracking methods that use only a single sensing source, and that it is robust to target occlusion, illumination changes, and interference from similar textures and complex backgrounds. |
Database: | OpenAIRE |
External link: |
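The record's abstract describes two steps that lend themselves to a worked example: mapping the drift-corrected inertial position onto the video image plane via camera calibration, and fusing that mapped position with the visual tracker's output using a similarity-based weight. The paper itself provides no code, so the following is a minimal sketch assuming a standard pinhole camera model; the function names, the calibration values, and the linear weighting rule `w = alpha * similarity` are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def project_to_image(X_world, K, R, t):
    """Map a 3-D world position (e.g., the drift-corrected inertial
    sensor position) onto the image plane with a calibrated pinhole
    camera: x ~ K [R | t] X."""
    X_cam = R @ X_world + t       # world -> camera coordinates
    x = K @ X_cam                 # camera -> homogeneous pixel coordinates
    return x[:2] / x[2]           # perspective division

def fuse_positions(p_visual, p_mapped, similarity, alpha=0.5):
    """Blend the visual-tracking position with the mapped inertial
    position. `similarity` in [0, 1] measures how well the visual
    candidate matches the target appearance; when the visual cue is
    unreliable (occlusion, illumination change), more weight shifts
    to the mapped inertial position. This weighting rule is an
    assumption for illustration, not the paper's exact formula."""
    w = alpha * similarity
    return w * p_visual + (1.0 - w) * p_mapped

# Hypothetical calibration and measurements, for illustration only.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])          # camera intrinsics
R, t = np.eye(3), np.zeros(3)                  # camera extrinsics

p_inertial_world = np.array([1.2, 0.4, 5.0])   # drift-corrected position (m)
p_mapped = project_to_image(p_inertial_world, K, R, t)
p_visual = np.array([515.0, 300.0])            # visual tracker output (px)

print(fuse_positions(p_visual, p_mapped, similarity=0.8))
```

Run as-is, the inertial position projects to pixel (512, 304) and the fused estimate lands between the two cues, closer to the mapped position because the similarity-derived weight on the visual cue is below one half.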