Multi‐head mutual‐attention CycleGAN for unpaired image‐to‐image translation

Author: Wei Ji, Jing Guo, Yun Li
Language: English
Year of publication: 2020
Subject:
Source: IET Image Processing, Vol 14, Iss 11, pp. 2395-2402 (2020)
Document type: article
ISSN: 1751-9667; 1751-9659
DOI: 10.1049/iet-ipr.2019.1153
Description: Image-to-image translation, i.e. translation from a source image domain to a target image domain, has made significant progress in recent years. The most popular method for unpaired image-to-image translation is CycleGAN. However, it often fails to learn the key features of the target domain accurately and rapidly, so the model converges slowly and the translation quality needs to be improved. In this study, a multi-head mutual-attention CycleGAN (MMA-CycleGAN) model is proposed for unpaired image-to-image translation. MMA-CycleGAN retains the cycle-consistency loss and adversarial loss of CycleGAN but introduces a mutual-attention (MA) mechanism, which allows attention-driven, long-range dependency modelling between the two image domains. Moreover, to deal efficiently with large image sizes, the MA mechanism is further extended to a multi-head mutual-attention (MMA) mechanism. In addition, domain labels are adopted to simplify the MMA-CycleGAN architecture, so only one generator is required to perform the bidirectional translation task. Experiments on multiple datasets demonstrate that MMA-CycleGAN learns more rapidly and produces photo-realistic images in less time than CycleGAN.
Database: Directory of Open Access Journals
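
The mutual-attention idea described in the abstract above can be made concrete with a short sketch. The PyTorch module below is an illustrative reconstruction, not the authors' code: the 1x1-convolution projections, the zero-initialised residual gate gamma, and the head count are all assumptions. It only shows the core mechanism, namely queries drawn from one domain's feature map attending over keys and values from the other domain's feature map, split across several heads.

import torch
import torch.nn as nn

class MultiHeadMutualAttention(nn.Module):
    """Sketch of a multi-head mutual-attention (MMA) block: features of one
    image domain (query) attend to features of the other (key/value),
    giving long-range dependencies *between* the two domains."""

    def __init__(self, channels: int, num_heads: int = 8):
        super().__init__()
        assert channels % num_heads == 0, "channels must divide evenly across heads"
        self.num_heads = num_heads
        self.head_dim = channels // num_heads
        # 1x1 convolutions project feature maps into query/key/value spaces
        # (an assumption; the paper's exact projections are not given here).
        self.to_q = nn.Conv2d(channels, channels, kernel_size=1)
        self.to_k = nn.Conv2d(channels, channels, kernel_size=1)
        self.to_v = nn.Conv2d(channels, channels, kernel_size=1)
        self.proj = nn.Conv2d(channels, channels, kernel_size=1)
        # Learnable gate, zero-initialised so the block starts as identity.
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x_src: torch.Tensor, x_tgt: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x_src.shape
        # Queries come from the source domain, keys/values from the target.
        q = self.to_q(x_src).view(b, self.num_heads, self.head_dim, h * w)
        k = self.to_k(x_tgt).view(b, self.num_heads, self.head_dim, h * w)
        v = self.to_v(x_tgt).view(b, self.num_heads, self.head_dim, h * w)
        # Scaled dot-product attention per head: (b, heads, hw, hw) weights
        # over target-domain positions for every source-domain position.
        attn = torch.softmax(
            q.transpose(-2, -1) @ k / self.head_dim ** 0.5, dim=-1
        )
        # Aggregate target-domain values and fold heads back into channels.
        out = (v @ attn.transpose(-2, -1)).reshape(b, c, h, w)
        return x_src + self.gamma * self.proj(out)

# Usage on small feature maps (the hw-by-hw attention matrix grows
# quadratically with spatial size, hence the multi-head/efficiency concern
# the abstract mentions for large images):
x_a = torch.randn(1, 64, 32, 32)  # source-domain feature map
x_b = torch.randn(1, 64, 32, 32)  # target-domain feature map
mma = MultiHeadMutualAttention(64, num_heads=8)
y = mma(x_a, x_b)                 # -> shape (1, 64, 32, 32)

How multi-head splitting yields the efficiency gain on large images is not specified in the abstract; here it simply reduces the per-head projection width while keeping the overall channel count fixed.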