GaitGANv2: Invariant gait feature extraction using generative adversarial networks

Authors: Edel B. Garcia, Weizhi An, Norman Poh, Yongzhen Huang, Shiqi Yu, Haifeng Chen, Rijun Liao
Year of publication: 2019
Subject:
Source: Pattern Recognition. 87:179-189
ISSN: 0031-3203
Description: The performance of gait recognition can be adversely affected by many sources of variation, such as view angle, clothing, carried bags, posture, and occlusion. To extract invariant gait features, we propose a method called GaitGANv2, which is based on generative adversarial networks (GANs). In the proposed method, a GAN model serves as a regressor that generates a canonical side view of a walking subject in normal clothing without any carried bag. A unique advantage of this approach is that, unlike other methods, GaitGANv2 does not need to determine the view angle before generating invariant gait images. Indeed, only one model is needed to account for all sources of variation, such as carried accessories and varying view angles. The most important computational challenge, however, is retaining useful identity information when generating the invariant gait images. To this end, our approach differs from a traditional GAN in that GaitGANv2 contains two discriminators instead of one: a fake/real discriminator and an identification discriminator. While the first discriminator ensures that the generated gait images are realistic, the second preserves the human identity information. GaitGANv2 improves on GaitGANv1 by adopting a multi-loss strategy that optimizes the network to increase the inter-class distance and reduce the intra-class distance simultaneously. Experimental results show that GaitGANv2 achieves state-of-the-art performance.
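The two-discriminator objective described in the abstract can be sketched in a few lines. The generator receives a signal from the fake/real discriminator (its outputs should be judged real) and from the identification discriminator (its outputs should still match the correct identity). The function names, the binary-cross-entropy formulation, and the `lambda_id` weighting below are illustrative assumptions, not the authors' actual implementation:

```python
import numpy as np

def bce(pred, target):
    # Binary cross-entropy between sigmoid outputs in (0, 1) and 0/1 targets.
    eps = 1e-7
    pred = np.clip(pred, eps, 1 - eps)
    return float(-np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred)))

def generator_loss(d_real_fake_out, d_identity_out, lambda_id=1.0):
    """Hypothetical multi-loss for the generator.

    d_real_fake_out: fake/real discriminator's scores on generated images
                     (the generator wants these judged real, target = 1).
    d_identity_out:  identification discriminator's scores that the generated
                     image carries the correct identity (target = 1).
    lambda_id:       assumed weight balancing the two terms.
    """
    adv_loss = bce(d_real_fake_out, np.ones_like(d_real_fake_out))
    id_loss = bce(d_identity_out, np.ones_like(d_identity_out))
    return adv_loss + lambda_id * id_loss
```

In this sketch, minimizing `adv_loss` pushes the generated side-view images toward realism, while minimizing `id_loss` preserves identity information; summing the two terms is what lets one objective increase inter-class distance and reduce intra-class distance at the same time.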
Database: OpenAIRE