Deep multi-Wasserstein unsupervised domain adaptation

Author: Tien-Nam Le, Marc Sebban, Amaury Habrard
Year of publication: 2019
Subject:
Source: Pattern Recognition Letters. 125:249-255
ISSN: 0167-8655
DOI: 10.1016/j.patrec.2019.04.025
Description: In unsupervised domain adaptation (DA), one aims at learning, from labeled source data and fully unlabeled target examples, a model with a low error on the target domain. In this setting, standard generalization bounds prompt us to minimize the sum of three terms: (a) the source true risk, (b) the divergence between the source and target domains, and (c) the combined error of the ideal joint hypothesis over the two domains. Many DA methods, especially those using deep neural networks, have focused on the first two terms by using different divergence measures to align the source and target distributions in a shared latent feature space, while ignoring the third term, assuming it to be negligible when performing the adaptation. However, it has been shown that purely aligning the two distributions while minimizing the source error may lead to so-called negative transfer. In this paper, we address this issue with a new deep unsupervised DA method, called MCDA, which minimizes the first two terms while controlling the third one. MCDA benefits from highly confident target samples (using softmax predictions) to minimize class-wise Wasserstein distances and efficiently approximate the ideal joint hypothesis. Empirical results show that our approach outperforms state-of-the-art methods.
Database: OpenAIRE
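
The class-wise alignment idea described in the abstract can be illustrated with a short sketch: keep only target samples whose softmax confidence exceeds a threshold, pseudo-label them, and compute a per-class Wasserstein distance between source features and confidently pseudo-labeled target features of the same class. The sketch below is a minimal PyTorch illustration using an entropy-regularized Sinkhorn approximation of the Wasserstein distance; the function names, the confidence threshold of 0.9, and the regularization strength eps are illustrative assumptions, not details taken from the paper.

import torch

def sinkhorn_wasserstein(x, y, eps=0.1, n_iters=50):
    # Entropy-regularized Wasserstein distance between two point clouds with
    # uniform weights, approximated by Sinkhorn iterations (illustrative choice,
    # not necessarily the estimator used in the paper).
    C = torch.cdist(x, y, p=2)                    # pairwise Euclidean costs
    n, m = C.shape
    mu = torch.full((n,), 1.0 / n, device=x.device)
    nu = torch.full((m,), 1.0 / m, device=x.device)
    K = torch.exp(-C / eps)                       # Gibbs kernel
    u, v = torch.ones_like(mu), torch.ones_like(nu)
    for _ in range(n_iters):                      # alternating scaling updates
        u = mu / (K @ v + 1e-9)
        v = nu / (K.t() @ u + 1e-9)
    pi = u.unsqueeze(1) * K * v.unsqueeze(0)      # approximate transport plan
    return (pi * C).sum()

def classwise_wasserstein_loss(f_s, y_s, f_t, logits_t, num_classes, threshold=0.9):
    # Sum of per-class distances between source features of class c and target
    # features pseudo-labeled as c with softmax confidence above the threshold.
    probs = torch.softmax(logits_t, dim=1)
    conf, pseudo = probs.max(dim=1)
    keep = conf > threshold                       # highly confident targets only
    loss = f_s.new_zeros(())
    for c in range(num_classes):
        xs = f_s[y_s == c]
        xt = f_t[keep & (pseudo == c)]
        if len(xs) > 0 and len(xt) > 0:
            loss = loss + sinkhorn_wasserstein(xs, xt)
    return loss

In a training loop, such a class-wise term would typically be added to the usual source classification loss and a global source/target alignment loss; the confidence threshold and eps would need tuning in practice.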