Learning to Generalize Unseen Domains via Memory-based Multi-Source Meta-Learning for Person Re-Identification
Author: Fengxiang Yang, Yuyang Zhao, Zhiming Luo, Nicu Sebe, Shaozi Li, Zhun Zhong, Yaojin Lin
Year of publication: 2021
Subject: FOS: Computer and information sciences; Computer Science - Computer Vision and Pattern Recognition (cs.CV); Meta learning (computer science); Generalization; Data modeling; Domain (software engineering); Identification (information); Pattern recognition (psychology); Unsupervised learning; Artificial intelligence
Source: CVPR 2021 (IEEE/CVF Conference on Computer Vision and Pattern Recognition)
Description: Recent advances in person re-identification (ReID) achieve impressive accuracy in both supervised and unsupervised settings. However, most existing methods need to train a new model for each new domain by accessing its data. Due to privacy concerns, data from a new domain are not always accessible, which limits the applicability of these methods. In this paper, we study the problem of multi-source domain generalization in ReID, which aims to learn a model that performs well on unseen domains using only several labeled source domains. To address this problem, we propose the Memory-based Multi-Source Meta-Learning (M$^3$L) framework to train a generalizable model for unseen domains. Specifically, a meta-learning strategy is introduced to simulate the train-test process of domain generalization, encouraging the model to learn more generalizable representations. To overcome the unstable meta-optimization caused by a parametric classifier, we propose a memory-based identification loss that is non-parametric and harmonizes with meta-learning. We also present a meta batch normalization layer (MetaBN) to diversify meta-test features, further strengthening the benefit of meta-learning. Experiments demonstrate that M$^3$L can effectively enhance the generalization ability of the model on unseen domains and outperforms state-of-the-art methods on four large-scale ReID datasets.
Comment: CVPR 2021
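The description mentions a memory-based, non-parametric identification loss that replaces a learnable classifier. The sketch below is a rough illustration of that general idea, not the authors' released implementation: it keeps one feature centroid per identity in a memory bank, classifies samples by cosine similarity to those centroids, and refreshes the bank with a momentum rule. The class name `MemoryIDLoss` and the temperature and momentum values are assumptions made for illustration.

```python
# A minimal sketch (not the authors' code) of a memory-based,
# non-parametric identification loss: identity centroids are stored in a
# buffer and updated by momentum, so no classifier weights are learned.
import torch
import torch.nn.functional as F


class MemoryIDLoss(torch.nn.Module):
    def __init__(self, num_ids, feat_dim, temperature=0.05, momentum=0.2):
        super().__init__()
        self.temperature = temperature  # assumed value, for illustration
        self.momentum = momentum        # assumed value, for illustration
        # Memory of identity centroids (a buffer, not a learnable parameter).
        self.register_buffer("memory", torch.zeros(num_ids, feat_dim))

    def forward(self, feats, labels):
        feats = F.normalize(feats, dim=1)
        # Cosine similarity of each sample to every identity centroid.
        logits = feats @ self.memory.t() / self.temperature
        loss = F.cross_entropy(logits, labels)
        # Momentum update of the centroids for identities in the batch.
        with torch.no_grad():
            for f, y in zip(feats, labels):
                updated = self.momentum * self.memory[y] + (1 - self.momentum) * f
                self.memory[y] = F.normalize(updated, dim=0)
        return loss


# Usage sketch: backbone features and integer identity labels.
if __name__ == "__main__":
    criterion = MemoryIDLoss(num_ids=100, feat_dim=256)
    feats = torch.randn(8, 256, requires_grad=True)
    labels = torch.randint(0, 100, (8,))
    loss = criterion(feats, labels)
    loss.backward()
```

Because the centroids are buffers rather than parameters, the inner meta-learning updates only touch the backbone, which is one plausible reading of why such a loss avoids the unstable meta-optimization attributed to a parametric classifier.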
Database: OpenAIRE
External link: