Rare Events Detection and Localization In Crowded Scenes Based On Flow Signature

Authors: Bruno Emile, Dieudonné Fabrice Atrevi, Damien Vivet
Contributors: Institut Supérieur de l'Aéronautique et de l'Espace - ISAE-SUPAERO (FRANCE), Institut National des Sciences Appliquées - INSA (FRANCE), Laboratoire Pluridisciplinaire de Recherche en Ingénierie des Systèmes, Mécanique et Energétique (PRISME), Université d'Orléans (UO)-Ecole Nationale Supérieure d'Ingénieurs de Bourges (ENSI Bourges), Institut Supérieur de l'Aéronautique et de l'Espace (ISAE-SUPAERO)
Language: English
Year of publication: 2019
Source: 2019 Ninth International Conference on Image Processing Theory, Tools and Applications (IPTA)
2019 Ninth International Conference on Image Processing Theory, Tools and Applications (IPTA), Nov 2019, Istanbul, Turkey. pp.1-6, ⟨10.1109/IPTA.2019.8936073⟩
DOI: 10.1109/IPTA.2019.8936073
Abstract: We introduce in this paper a novel method for rare event detection based on the optical flow signature. It aims to automatically highlight regions in videos where rare events are occurring. Such a method can serve as an important step in many applications, such as Closed-Circuit Television (CCTV) monitoring systems, in order to reduce the cognitive effort of operators by focusing their attention on the interesting regions. The proposed method exploits the properties of the Discrete Cosine Transform (DCT) applied to the magnitude and orientation maps of the optical flow. The output of the algorithm is a map in which each pixel has a saliency score indicating the presence of motion that is irregular with respect to the scene. Based on the one-class Support Vector Machine (SVM) algorithm, a model of the frequent events is created, and rare event detection is then performed using this model. The DCT is fast and easy to compute, and it provides useful information for detecting spatially irregular patterns in images [1]. Our method does not rely on any prior information about the scene and uses the saliency score as a feature descriptor. We demonstrate the potential of the proposed method on the publicly available UCSD video dataset and show that it is competitive with, and outperforms some of, the state-of-the-art methods.
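A minimal sketch of the pipeline the abstract describes, assuming NumPy, SciPy, and scikit-learn: a DCT-signature-style saliency map is computed from a flow magnitude map, and a one-class SVM trained on frequent-event frames flags outliers. The toy flow data, the Gaussian smoothing, and the per-frame statistics used as features are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np
from scipy.fft import dctn, idctn
from scipy.ndimage import gaussian_filter
from sklearn.svm import OneClassSVM

def saliency_map(channel):
    # Signature-style saliency: keep only the sign of the DCT spectrum,
    # invert it, square, and smooth. (An illustrative variant of the
    # DCT-based saliency described in the abstract.)
    sig = np.sign(dctn(channel, norm="ortho"))
    recon = idctn(sig, norm="ortho")
    return gaussian_filter(recon ** 2, sigma=2)

def features(flow_mag):
    # Per-frame descriptor from saliency statistics (an assumption made
    # for this sketch; the paper uses the saliency score directly).
    s = saliency_map(flow_mag)
    return [s.std(), s.max()]

# Toy "flow magnitude" maps: near-uniform motion stands in for frequent
# events, plus one frame containing a localized irregular-motion blob.
rng = np.random.default_rng(0)
normal_frames = [rng.normal(1.0, 0.05, (32, 32)) for _ in range(30)]
rare_frame = rng.normal(1.0, 0.05, (32, 32))
rare_frame[10:16, 10:16] += 3.0

# One-class SVM fitted only on frequent-event features.
model = OneClassSVM(nu=0.1, gamma="scale")
model.fit(np.array([features(f) for f in normal_frames]))

print(model.predict([features(rare_frame)])[0])  # -1 would flag a rare event
```

For a constant input the DCT signature keeps only the DC sign, so the saliency map is flat; structure appears only where the motion pattern deviates from the dominant flow, which is what makes the sign spectrum useful for localization.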
Database: OpenAIRE