Exploiting Invariance of Mining Facial Landmarks

Authors: Zekuan Yu, Jiangming Shi, Zixian Gao, Fengjun Li, Hao Liu
Year of publication: 2021
Subject:
Source: ACM Multimedia
DOI: 10.1145/3474085.3475582
Description: In this paper, we propose an invariant learning method for mining facial landmarks in a self-supervised manner. Conventional methods mostly train on raw data of paired facial appearances and landmarks, assuming that the pairs are evenly distributed. However, such assumptions rarely hold in real-world scenarios, and they tend to cause failures in challenging cases even after costly training. To address this issue, our model learns to be invariant to facial biases by training on landmark-anchored distributions. Specifically, we generate faces from these distributions and then group them into intra-identity and intra-landmark classes based on the appearance sources and the probe facial landmarks, respectively. From these groups we construct intra-class invariance losses that disentangle the spatial structures from the appearances. In addition, we adopt a reconstruction loss to produce more realistic faces conditioned on the probe landmarks. Extensive experimental results on four standard facial landmark datasets demonstrate that our method achieves compelling performance compared with both supervised and unsupervised methods.
Database: OpenAIRE
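
The description above combines two intra-class invariance terms (one over appearance-source groups, one over probe-landmark groups) with a reconstruction term. Since the record contains only the abstract, the following is a minimal PyTorch-style sketch of how such a loss combination might look; all function names, weights, and the choice of mean-squared distance to class centers are hypothetical and are not taken from the paper.

```python
import torch
import torch.nn.functional as F

def intra_class_invariance_loss(features: torch.Tensor,
                                labels: torch.Tensor) -> torch.Tensor:
    """Pull embeddings of the same class toward their class mean.

    features: (N, D) embeddings of generated faces.
    labels:   (N,) integer class ids -- either appearance-source
              (intra-identity) or probe-landmark (intra-landmark) groups.
    """
    classes = labels.unique()
    loss = features.new_zeros(())
    for c in classes:
        members = features[labels == c]
        if members.shape[0] < 2:
            continue  # a singleton class contributes no invariance signal
        center = members.mean(dim=0, keepdim=True)
        loss = loss + F.mse_loss(members, center.expand_as(members))
    return loss / max(len(classes), 1)

def total_loss(appearance_feats, landmark_feats,
               identity_ids, landmark_ids,
               generated_faces, target_faces,
               w_id=1.0, w_lm=1.0, w_rec=1.0):
    """Hypothetical weighted sum of the two invariance terms
    and an L1 pixel reconstruction term."""
    l_id = intra_class_invariance_loss(appearance_feats, identity_ids)
    l_lm = intra_class_invariance_loss(landmark_feats, landmark_ids)
    l_rec = F.l1_loss(generated_faces, target_faces)
    return w_id * l_id + w_lm * l_lm + w_rec * l_rec
```

Under this reading, grouping the same generated faces twice (once by identity, once by landmarks) is what disentangles spatial structure from appearance: features must agree within an identity group regardless of pose, and within a landmark group regardless of identity.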