MADAN: Multi-source Adversarial Domain Aggregation Network for Domain Adaptation

Authors: Kurt Keutzer, Sicheng Zhao, Bo Li, Guiguang Ding, Xiangyu Yue, Pengfei Xu
Year of publication: 2021
Source: International Journal of Computer Vision. 129:2399-2424
ISSN: 1573-1405, 0920-5691
DOI: 10.1007/s11263-021-01479-3
Abstract: Domain adaptation aims to learn a transferable model that bridges the domain shift between a labeled source domain and a sparsely labeled or unlabeled target domain. Since labeled data may be collected from multiple sources, multi-source domain adaptation (MDA) has attracted increasing attention. Recent MDA methods consider neither the pixel-level alignment between the sources and the target nor the misalignment across different sources. In this paper, we propose a novel MDA framework to address these challenges. Specifically, we design a novel Multi-source Adversarial Domain Aggregation Network (MADAN). First, an adapted domain is generated for each source with dynamic semantic consistency while aligning towards the target at the pixel level in a cycle-consistent manner. Second, a sub-domain aggregation discriminator and a cross-domain cycle discriminator are proposed to aggregate the different adapted domains more closely. Finally, feature-level alignment is performed between the aggregated domain and the target domain while training the task network. For segmentation adaptation, we further enforce category-level alignment and incorporate multi-scale image generation, which constitutes MADAN+. We conduct extensive MDA experiments on digit recognition, object classification, and simulation-to-real semantic segmentation tasks. The results demonstrate that the proposed MADAN and MADAN+ models outperform state-of-the-art approaches by a large margin.
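To make the final step of the pipeline concrete, the sketch below illustrates one adversarial feature-level alignment update between an aggregated (adapted source) batch and a target batch, as the abstract describes: a domain discriminator learns to tell the two domains apart while the feature extractor learns to fool it. This is a minimal illustration assuming PyTorch, not the authors' released MADAN code; the names FeatureExtractor, DomainDiscriminator, and feature_alignment_step, and the toy input sizes, are hypothetical placeholders.

```python
# Minimal sketch (assumed PyTorch, not the authors' implementation) of
# adversarial feature-level alignment between the aggregated domain and
# the target domain. All module and variable names are illustrative.
import torch
import torch.nn as nn

class FeatureExtractor(nn.Module):
    """Toy backbone standing in for the task network's encoder."""
    def __init__(self, in_dim=3 * 32 * 32, feat_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(in_dim, 512), nn.ReLU(),
            nn.Linear(512, feat_dim), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

class DomainDiscriminator(nn.Module):
    """Predicts whether a feature comes from the aggregated domain or the target."""
    def __init__(self, feat_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, 128), nn.ReLU(),
            nn.Linear(128, 1),
        )

    def forward(self, f):
        return self.net(f)

def feature_alignment_step(F, D, opt_F, opt_D, x_agg, x_tgt):
    """One adversarial update: D separates the domains, F tries to fool D."""
    bce = nn.BCEWithLogitsLoss()

    # Discriminator update: aggregated domain -> label 1, target -> label 0.
    with torch.no_grad():
        f_agg, f_tgt = F(x_agg), F(x_tgt)
    d_loss = bce(D(f_agg), torch.ones(f_agg.size(0), 1)) + \
             bce(D(f_tgt), torch.zeros(f_tgt.size(0), 1))
    opt_D.zero_grad(); d_loss.backward(); opt_D.step()

    # Feature-extractor update: make target features indistinguishable
    # from aggregated-domain features.
    g_loss = bce(D(F(x_tgt)), torch.ones(x_tgt.size(0), 1))
    opt_F.zero_grad(); g_loss.backward(); opt_F.step()
    return d_loss.item(), g_loss.item()

if __name__ == "__main__":
    F_net, D_net = FeatureExtractor(), DomainDiscriminator()
    opt_F = torch.optim.Adam(F_net.parameters(), lr=1e-4)
    opt_D = torch.optim.Adam(D_net.parameters(), lr=1e-4)
    x_agg = torch.randn(8, 3, 32, 32)   # batch from the aggregated adapted domain
    x_tgt = torch.randn(8, 3, 32, 32)   # batch from the unlabeled target domain
    print(feature_alignment_step(F_net, D_net, opt_F, opt_D, x_agg, x_tgt))
```

In the full method this adversarial term would be optimized jointly with the task loss on the aggregated (labeled) domain; the sketch isolates only the alignment step for clarity.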
Database: OpenAIRE