Spatial-Temporal Graph Attention Network for Video-Based Gait Recognition
| Author: | Shiqi Yu, Weizhi An, Xinhui Wu, Edel B. Garcia, Weiyu Guo |
| --- | --- |
| Year of publication: | 2020 |
| Subject: | 021110 strategic defence & security studies; business.industry; Computer science; 0211 other engineering and technologies; Pattern recognition; 02 engineering and technology; Discriminative model; Attention network; 0202 electrical engineering, electronic engineering, information engineering; 020201 artificial intelligence & image processing; Artificial intelligence; business; Video based |
| Source: | Lecture Notes in Computer Science, ISBN: 9783030412982, ACPR (2) |
| Description: | Gait is an attractive feature for human identification at a distance. It can be regarded as a temporal signal, while the human body shape can be regarded as a signal in the spatial domain. The proposed method extracts discriminative features from video sequences in both the spatial and temporal domains with a single network, the Spatial-Temporal Graph Attention Network (STGAN). In the spatial domain, one branch selects distinctive regions and enhances their contribution, making the network focus on these regions. Another branch, a Spatial-Temporal Graph (STG), discovers the relationship between frames and the variation of a region in the temporal domain. The proposed method therefore extracts gait features in both domains, and the two branches can be trained end to end. Experimental results on two popular datasets, CASIA-B and OU-ISIR Treadmill-B, show that the proposed method clearly improves gait recognition (a minimal illustrative sketch of this two-branch design follows the record). |
| Database: | OpenAIRE |
| External link: | |
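
The record contains no code, so the sketch below is only a rough illustration of the two-branch idea described in the abstract: a spatial branch that re-weights distinctive regions of each frame, and a temporal branch that links frames through a simple graph-attention step. The framework (PyTorch), class names, layer sizes, and input shape are assumptions for illustration; they are not the authors' STGAN implementation.

```python
# Hypothetical sketch (not the authors' code) of a two-branch spatial-temporal model
# for silhouette sequences, assuming clips of shape (batch, frames, 1, H, W).
import torch
import torch.nn as nn


class SpatialAttentionBranch(nn.Module):
    """Scores local regions of each frame and re-weights their features,
    so that distinctive regions contribute more to the frame descriptor."""

    def __init__(self, channels: int):
        super().__init__()
        self.score = nn.Conv2d(channels, 1, kernel_size=1)  # per-region attention logits

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B*T, C, H, W) frame-level feature maps
        attn = torch.sigmoid(self.score(x))   # (B*T, 1, H, W) region weights
        weighted = x * attn                   # emphasise distinctive regions
        return weighted.mean(dim=(2, 3))      # (B*T, C) pooled frame feature


class SpatialTemporalGraphBranch(nn.Module):
    """Treats the frames of a clip as graph nodes and mixes their features
    along learned frame-to-frame affinities (a simple graph-attention step)."""

    def __init__(self, channels: int):
        super().__init__()
        self.proj = nn.Linear(channels, channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, T, C) per-frame features
        q = self.proj(x)                                                  # (B, T, C)
        affinity = torch.softmax(q @ x.transpose(1, 2) / x.size(-1) ** 0.5, dim=-1)
        return affinity @ x                                               # (B, T, C)


class TwoBranchGaitNet(nn.Module):
    def __init__(self, channels: int = 64, embed_dim: int = 128, num_ids: int = 100):
        super().__init__()
        self.backbone = nn.Sequential(  # small CNN applied to each silhouette frame
            nn.Conv2d(1, channels, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.spatial = SpatialAttentionBranch(channels)
        self.temporal = SpatialTemporalGraphBranch(channels)
        self.head = nn.Linear(channels, embed_dim)
        self.classifier = nn.Linear(embed_dim, num_ids)  # only needed for end-to-end training

    def forward(self, clips: torch.Tensor):
        b, t, c, h, w = clips.shape
        feats = self.backbone(clips.reshape(b * t, c, h, w))  # (B*T, C, H', W')
        frame_feats = self.spatial(feats).reshape(b, t, -1)   # (B, T, C) spatial branch
        mixed = self.temporal(frame_feats)                    # (B, T, C) temporal branch
        embedding = self.head(mixed.mean(dim=1))              # (B, embed_dim) clip descriptor
        return embedding, self.classifier(embedding)


if __name__ == "__main__":
    model = TwoBranchGaitNet()
    clips = torch.randn(2, 30, 1, 64, 44)  # 2 clips of 30 silhouette frames (CASIA-B-like size)
    embedding, logits = model(clips)
    print(embedding.shape, logits.shape)   # torch.Size([2, 128]) torch.Size([2, 100])
```

Both branches sit in one model and are optimised jointly, which mirrors the end-to-end training claim in the abstract; the specific attention and affinity formulations above are placeholders, not those of the paper.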