TN-ZSTAD: Transferable Network for Zero-Shot Temporal Activity Detection
Author: | Lingling Zhang, Xiaojun Chang, Jun Liu, Minnan Luo, Zhihui Li, Lina Yao, Alex Hauptmann |
Publication year: | 2022 |
Subject: | |
Source: | IEEE Transactions on Pattern Analysis and Machine Intelligence, pp. 1-14 |
ISSN: | 1939-3539, 0162-8828 |
DOI: | 10.1109/tpami.2022.3183586 |
Description: | An integral part of video analysis and surveillance is temporal activity detection, the task of simultaneously recognizing and localizing activities in long untrimmed videos. Currently, the most effective methods for temporal activity detection are based on deep learning, and they typically perform very well when large-scale annotated videos are available for training. However, their use in real applications is limited by the unavailability of videos for certain activity classes and by time-consuming data annotation. To address this challenging problem, we propose a novel task setting called zero-shot temporal activity detection (ZSTAD), in which activities never seen during training must still be detected. We design an end-to-end deep transferable network, TN-ZSTAD, as the architecture for this solution. On the one hand, this network uses an activity graph transformer to predict a set of activity instances that appear in the video, rather than producing many activity proposals in advance. On the other hand, it captures the common semantics of seen and unseen activities from their corresponding label embeddings, and it is optimized with an innovative loss function that jointly considers the classification property on seen activities and the transfer property on unseen activities. Experiments on the THUMOS'14, Charades, and ActivityNet datasets show promising performance in detecting unseen activities. |
Database: | OpenAIRE |
External link: |
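The zero-shot idea summarized in the description — scoring video segments against label embeddings shared by seen and unseen activity classes — can be sketched roughly as follows. This is a minimal NumPy illustration, not the paper's actual TN-ZSTAD architecture: the dimensions, the random features, and the cosine-similarity scoring are all assumptions for the sake of the example.

```python
import numpy as np

def cosine_scores(features, label_embeddings):
    """Score each segment feature against every class label embedding.

    Both inputs are L2-normalized so the dot product is cosine similarity.
    """
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    l = label_embeddings / np.linalg.norm(label_embeddings, axis=1, keepdims=True)
    return f @ l.T

rng = np.random.default_rng(0)
# Hypothetical: 4 video-segment features already projected into a 300-d
# label-embedding space (e.g. word vectors of activity names).
segment_feats = rng.standard_normal((4, 300))
# 3 unseen activity classes, represented only by their label embeddings —
# no training videos for these classes are assumed to exist.
unseen_label_emb = rng.standard_normal((3, 300))

scores = cosine_scores(segment_feats, unseen_label_emb)
pred = scores.argmax(axis=1)  # unseen class assigned to each segment
print(scores.shape, pred.shape)  # (4, 3) (4,)
```

In this toy setup, detection of an unseen activity reduces to nearest-label-embedding classification of segment features; the paper's contribution lies in how those features and the joint seen/unseen loss are learned, which this sketch does not model.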