Image synthesis with adversarial networks: A comprehensive survey and case studies

Authors: Huiyu Zhou, Jie Yang, Ruili Wang, Pourya Shamsolmoali, M. Emre Celebi, Eric Granger, Masoumeh Zareapoor
Year of publication: 2021
Source: Information Fusion. 72:126-146
ISSN: 1566-2535
DOI: 10.1016/j.inffus.2021.02.014
Description: Generative Adversarial Networks (GANs) have been extremely successful in various application domains such as computer vision, medicine, and natural language processing. Moreover, transforming an object or a person into a desired shape has become a well-studied research topic in GANs. GANs are powerful models for learning complex distributions and synthesizing semantically meaningful samples. However, the field lacks a comprehensive review, particularly one that collects GAN loss variants, evaluation metrics, remedies for diverse image generation, and techniques for stable training. Given the rapid pace of GAN development, this survey provides a comprehensive review of adversarial models for image synthesis. We summarize synthetic image generation methods and discuss their categories, including image-to-image translation, fusion image generation, label-to-image mapping, and text-to-image translation. We organize the literature by base model and by the ideas developed around architectures, constraints, loss functions, evaluation metrics, and training datasets. We present the milestones of adversarial models, review an extensive selection of previous works in various categories, and offer insights into the development route from model-based to data-driven methods. Furthermore, we highlight a range of potential future research directions. A unique feature of this review is that all software implementations of these GAN methods, along with the datasets, have been collected and made available in one place at https://github.com/pshams55/GAN-Case-Study.
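As background for the loss variants the survey catalogues, it may help to restate the standard minimax objective from the original GAN formulation (Goodfellow et al., 2014), which this record does not spell out: a generator G maps noise z ~ p_z to samples, while a discriminator D scores samples as real or synthetic,

\min_G \max_D V(D, G) = \mathbb{E}_{x \sim p_{\mathrm{data}}}\!\left[\log D(x)\right] + \mathbb{E}_{z \sim p_z}\!\left[\log\big(1 - D(G(z))\big)\right].

The loss variants, stability remedies, and evaluation metrics collected in the survey can broadly be read as modifications of, or diagnostics for, this objective.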
Database: OpenAIRE