Batch Hard Contrastive Loss and Its Application to Cross-View Gait Recognition

Author: Mohamad Ammar Alsherfawi Aljazaerly, Yasushi Makihara, Daigo Muramatsu, Yasushi Yagi
Language: English
Year of publication: 2023
Subject:
Source: IEEE Access, Vol. 11, pp. 31177-31187 (2023)
Document type: article
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2023.3262271
Description: Biometric person authentication comprises two tasks: the identification task (i.e., one-to-many matching) and the verification task (i.e., one-to-one matching). In this paper, we propose a loss function called batch hard contrastive loss (BHCn) for the deep learning-based verification task. For this purpose, we consider batch mining techniques developed for the identification task and translate them to the verification task. More specifically, inspired by batch-mining triplet losses that learn a relative distance for the identification task, we propose BHCn to learn an absolute distance that better represents verification in general. Our method preserves the identity-agnostic nature of the contrastive loss by selecting the hardest pair of samples for each pair of identities in a batch, instead of selecting the hardest pair for each sample. We validate the effectiveness of the proposed method on cross-view gait recognition using three networks: a lightweight network with a simple input, structure, and output, which we call GEI + CNN (Gait Energy Image Convolutional Neural Network), as well as the widely used GaitSet and GaitGL, which have sophisticated inputs, structures, and outputs. We trained these networks on two publicly available silhouette-based datasets: the OU-ISIR Gait Database Multi-View Large Population (OU-MVLP) dataset and the Institute of Automation Chinese Academy of Sciences Gait Database Multiview (CASIA-B) dataset. Experimental results show that the proposed BHCn outperforms other loss functions, such as triplet loss with batch mining and the conventional contrastive loss.
Database: Directory of Open Access Journals
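The hardest-pair selection described in the abstract can be sketched as follows. This is a hypothetical illustration of the batch hard contrastive idea, not the authors' implementation: for each identity we keep only the farthest positive pair, and for each pair of distinct identities only the closest negative pair, then apply the standard contrastive terms (squared distance for positives, squared hinge with a margin for negatives). The function name, margin value, and Euclidean distance choice are assumptions.

```python
import math
from itertools import combinations

def euclidean(u, v):
    # Plain Euclidean distance between two embedding vectors.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def batch_hard_contrastive(embeddings, labels, margin=1.0):
    """Sketch of a batch-hard contrastive loss over one mini-batch.

    embeddings: list of embedding vectors (tuples/lists of floats)
    labels: list of identity ids, aligned with embeddings
    """
    ids = sorted(set(labels))
    groups = {i: [e for e, l in zip(embeddings, labels) if l == i] for i in ids}
    terms = []
    # Positive term: hardest (farthest) same-identity pair per identity.
    for i in ids:
        g = groups[i]
        if len(g) < 2:
            continue
        d = max(euclidean(u, v) for u, v in combinations(g, 2))
        terms.append(d ** 2)
    # Negative term: hardest (closest) cross-identity pair per identity pair.
    for a, b in combinations(ids, 2):
        d = min(euclidean(u, v) for u in groups[a] for v in groups[b])
        terms.append(max(0.0, margin - d) ** 2)
    return sum(terms) / len(terms)
```

With a toy batch of two identities, only three pairs contribute: the farthest pair within each identity and the single closest pair across them, so the loss is an average of one term per identity pair rather than one per sample, which keeps the selection identity-agnostic as the abstract describes.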