A multi-temporal framework for high-level activity analysis: Violent event detection in visual surveillance
Author: | Donghui Song, Chansu Kim, Sung-Kee Park |
Year of publication: | 2018 |
Subject: | Information Systems and Management; Computer Science Applications; Theoretical Computer Science; Artificial Intelligence; Control and Systems Engineering; Software; computer science; software engineering; pattern recognition; high-level activity; support vector machine; visual surveillance; perception; graph (abstract data type); artificial intelligence & image processing |
Source: | Information Sciences, 447:83–103 |
ISSN: | 0020-0255 |
DOI: | 10.1016/j.ins.2018.02.065 |
Description: | This paper presents a novel framework for high-level activity analysis based on late fusion over multiple independent temporal perception layers. The method handles the temporal diversity of high-level activities. The framework consists of multi-temporal analysis, multi-temporal perception layers, and late fusion. Two types of perception layers are built, based on situation graph trees (SGT) and support vector machines (SVMs). The results obtained from the multi-temporal perception layers are fused into a single activity score in a late-fusion step. To verify the approach, the framework is applied to violent event detection in visual surveillance, with experiments conducted on three datasets: BEHAVE, NUS–HGA, and several YouTube videos showing real situations. The proposed framework is also compared with existing single-temporal frameworks. The experiments yielded accuracies of 0.783 (SGT-based, BEHAVE), 0.702 (SVM-based, BEHAVE), 0.872 (SGT-based, NUS–HGA), and 0.699 (SGT-based, YouTube), showing that the multi-temporal approach has advantages over single-temporal methods. |
Database: | OpenAIRE |
External link: |
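
The description above only outlines the late-fusion idea: each temporal perception layer scores the same clips at its own time scale, and the per-layer scores are combined into one activity score. The sketch below is a minimal, hypothetical illustration of that scheme using SVM-based layers only; the synthetic features, window lengths, equal fusion weights, and scikit-learn `SVC` models are assumptions for demonstration and are not the authors' actual implementation (which also uses situation graph trees).

```python
# Hypothetical sketch of late fusion over multi-temporal SVM perception layers.
# All data, window lengths, and weights below are invented for illustration.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic per-window feature sets: one feature matrix per temporal scale
# (e.g., short / medium / long analysis windows), with shared binary labels
# (1 = violent event, 0 = normal activity).
window_lengths = [15, 30, 60]          # frames per analysis window (assumed)
n_clips, n_features = 200, 16
labels = rng.integers(0, 2, size=n_clips)
features = {w: rng.normal(size=(n_clips, n_features)) + labels[:, None] * 0.5
            for w in window_lengths}

# One SVM-based perception layer per temporal scale.
layers = {w: SVC(probability=True, random_state=0).fit(features[w], labels)
          for w in window_lengths}

# Late fusion: combine per-layer violence probabilities into a single
# activity score with a weighted average (equal weights, purely as an example).
weights = {w: 1.0 / len(window_lengths) for w in window_lengths}
scores = sum(weights[w] * layers[w].predict_proba(features[w])[:, 1]
             for w in window_lengths)

predictions = (scores > 0.5).astype(int)
print("fused accuracy on the synthetic data:", (predictions == labels).mean())
```

In this toy setup the fusion weights are fixed and equal; in practice the weights (or a learned fusion model) would be chosen on validation data so that temporal scales that discriminate violent events better contribute more to the final score.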