An Assessment of GANs for Identity-related Applications
Author: | Safa Madiouni, Liming Chen, Sami Romdhani, Richard T. Marriott, Stéphane Gentric |
Year of publication: | 2020 |
Subject: | Computer Science - Computer Vision and Pattern Recognition (cs.CV); Biometrics; Facial recognition; Generative Adversarial Networks; Overfitting; Machine learning; Artificial intelligence |
Source: | IJCB |
DOI: | 10.48550/arxiv.2012.10553 |
Description: | Generative Adversarial Networks (GANs) are now capable of producing synthetic face images of exceptionally high visual quality. In parallel to the development of GANs themselves, efforts have been made to develop metrics that objectively assess the characteristics of the synthetic images, mainly focusing on visual quality and the variety of images. Little work has been done, however, to assess overfitting of GANs and their ability to generate new identities. In this paper we apply a state-of-the-art biometric network to various datasets of synthetic images and perform a thorough assessment of their identity-related characteristics. We conclude that GANs can indeed be used to generate new, imagined identities, meaning that applications such as anonymisation of image sets and augmentation of training datasets with distractor images are viable. We also assess the ability of GANs to disentangle identity from other image characteristics and propose a novel GAN triplet loss that we show to improve this disentanglement. Comment: Presented at IJCB 2020 (oral) |
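The abstract does not reproduce the paper's GAN triplet loss, but its building block, the standard triplet loss over identity embeddings, can be sketched as below. This is a minimal illustration, not the authors' formulation: the function name, margin value, and toy embeddings are all assumptions for demonstration.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Hinge-style triplet loss on embedding vectors.

    Pulls the anchor toward the positive (same identity) and pushes it
    away from the negative (different identity), up to a margin.
    Margin of 0.2 is an illustrative choice, not from the paper.
    """
    d_pos = np.sum((anchor - positive) ** 2)  # squared distance to same identity
    d_neg = np.sum((anchor - negative) ** 2)  # squared distance to other identity
    return max(0.0, d_pos - d_neg + margin)

# Toy 2-D embeddings: anchor close to positive, far from negative.
a = np.array([1.0, 0.0])
p = np.array([0.9, 0.1])
n = np.array([-1.0, 0.0])

print(triplet_loss(a, p, n))  # 0.0: this triplet already satisfies the margin
print(triplet_loss(a, n, p))  # positive: the violated triplet incurs a loss
```

In the paper's setting, such a loss would be applied to identity embeddings of generated faces to encourage the generator to keep identity separate from other image attributes; the exact way the triplets are formed from GAN outputs is specific to the paper and not shown here.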
Database: | OpenAIRE |
External link: |