Spatial-Temporal Graph Attention Network for Video-Based Gait Recognition

Author: Shiqi Yu, Weizhi An, Xinhui Wu, Edel B. Garcia, Weiyu Guo
Year of publication: 2020
Subject:
Source: Lecture Notes in Computer Science ISBN: 9783030412982
ACPR (2)
Description: Gait is an attractive feature for human identification at a distance. It can be regarded as a temporal signal, while the human body shape can be regarded as a signal in the spatial domain. In the proposed method, we extract discriminative features from video sequences in both the spatial and temporal domains with a single network, the Spatial-Temporal Graph Attention Network (STGAN). In the spatial domain, we designed one branch that selects discriminative regions and enhances their contribution, making the network focus on those regions. We also constructed a second branch, a Spatial-Temporal Graph (STG), to discover the relationships between frames and the variation of each region in the temporal domain. The proposed method extracts gait features in both domains, and the two branches can be trained end to end. Experimental results on two popular datasets, CASIA-B and OU-ISIR Treadmill-B, show that the proposed method clearly improves gait recognition.
Database: OpenAIRE
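
The abstract describes a two-branch design: a spatial branch that weights informative regions of each frame, and a temporal branch that relates frames through a graph. The sketch below is a minimal illustration of such a two-branch arrangement in PyTorch, not the authors' STGAN: all module names, tensor shapes, channel sizes, and the chain-graph construction over frames are illustrative assumptions.

```python
# Minimal sketch of a two-branch spatial-temporal gait model (assumed design,
# not the published STGAN). Input: silhouette sequences (batch, frames, H, W).
import torch
import torch.nn as nn
import torch.nn.functional as F


class SpatialAttentionBranch(nn.Module):
    """Weights spatial locations of each frame so that more informative
    regions contribute more to the pooled per-frame feature."""
    def __init__(self, channels=64):
        super().__init__()
        self.encoder = nn.Conv2d(1, channels, 3, padding=1)
        self.attn = nn.Conv2d(channels, 1, 1)   # one attention score per location

    def forward(self, x):                        # x: (B*T, 1, H, W)
        feat = F.relu(self.encoder(x))           # (B*T, C, H, W)
        weights = torch.sigmoid(self.attn(feat)) # (B*T, 1, H, W)
        feat = feat * weights                    # emphasize selected regions
        return feat.mean(dim=(2, 3))             # (B*T, C) pooled frame feature


class TemporalGraphBranch(nn.Module):
    """Propagates frame features over a simple chain graph linking adjacent
    frames; a stand-in for a spatial-temporal graph over the sequence."""
    def __init__(self, channels=64):
        super().__init__()
        self.proj = nn.Linear(channels, channels)

    def forward(self, x):                        # x: (B, T, C)
        B, T, C = x.shape
        # chain-graph adjacency over frames with self-loops, row-normalized
        adj = torch.eye(T, device=x.device)
        idx = torch.arange(T - 1, device=x.device)
        adj[idx, idx + 1] = 1.0
        adj[idx + 1, idx] = 1.0
        adj = adj / adj.sum(dim=1, keepdim=True)
        x = F.relu(self.proj(adj @ x))           # one graph-propagation step
        return x.mean(dim=1)                     # (B, C) sequence-level feature


class TwoBranchGaitModel(nn.Module):
    """End-to-end trainable combination of the two branches."""
    def __init__(self, channels=64, num_ids=100):
        super().__init__()
        self.spatial = SpatialAttentionBranch(channels)
        self.temporal = TemporalGraphBranch(channels)
        self.classifier = nn.Linear(channels, num_ids)

    def forward(self, seq):                      # seq: (B, T, H, W)
        B, T, H, W = seq.shape
        frames = self.spatial(seq.reshape(B * T, 1, H, W)).reshape(B, T, -1)
        return self.classifier(self.temporal(frames))


if __name__ == "__main__":
    model = TwoBranchGaitModel(num_ids=124)      # CASIA-B contains 124 subjects
    dummy = torch.rand(2, 30, 64, 44)            # 2 sequences of 30 silhouettes
    print(model(dummy).shape)                    # torch.Size([2, 124])
```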