DA-Ada: Learning Domain-Aware Adapter for Domain Adaptive Object Detection

Author: Li, Haochen, Zhang, Rui, Yao, Hantao, Zhang, Xin, Hao, Yifan, Song, Xinkai, Li, Xiaqing, Zhao, Yongwei, Li, Ling, Chen, Yunji
Year of publication: 2024
Subject:
Document type: Working Paper
Description: Domain adaptive object detection (DAOD) aims to generalize detectors trained on an annotated source domain to an unlabelled target domain. Since visual-language models (VLMs) provide essential general knowledge about unseen images, freezing the visual encoder and inserting a domain-agnostic adapter can learn domain-invariant knowledge for DAOD. However, the domain-agnostic adapter is inevitably biased towards the source domain: it discards beneficial knowledge that is discriminative on the unlabelled target domain, i.e., the domain-specific knowledge of the target domain. To solve this issue, we propose a novel Domain-Aware Adapter (DA-Ada) tailored for the DAOD task. The key idea is to exploit the domain-specific knowledge lying between the essential general knowledge and the domain-invariant knowledge. DA-Ada consists of a Domain-Invariant Adapter (DIA) for learning domain-invariant knowledge and a Domain-Specific Adapter (DSA) for injecting domain-specific knowledge recovered from the information discarded by the visual encoder. Comprehensive experiments over multiple DAOD tasks show that DA-Ada efficiently infers a domain-aware visual encoder that boosts domain adaptive object detection. Our code is available at https://github.com/Therock90421/DA-Ada.
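The sketch below illustrates the adapter idea from the description: a frozen VLM visual encoder block augmented with a trainable domain-invariant branch (DIA) and a domain-specific branch (DSA) fed by the information the frozen block drops. It is a minimal PyTorch sketch under stated assumptions, not the authors' implementation: the module names, the bottleneck dimensions, and the use of the block's input/output residual as a stand-in for the "discarded information" are illustrative choices; the actual architecture is in the linked repository.

import torch
import torch.nn as nn


class BottleneckAdapter(nn.Module):
    # A standard down-project / nonlinearity / up-project adapter branch.
    def __init__(self, dim: int, hidden_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(dim, hidden_dim)
        self.act = nn.GELU()
        self.up = nn.Linear(hidden_dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.up(self.act(self.down(x)))


class DomainAwareAdapter(nn.Module):
    # Combines a domain-invariant branch (DIA) with a domain-specific branch
    # (DSA). Here the DSA input approximates "discarded information" as the
    # difference between the frozen block's input and output (an assumption).
    def __init__(self, dim: int, hidden_dim: int = 64):
        super().__init__()
        self.dia = BottleneckAdapter(dim, hidden_dim)  # domain-invariant knowledge
        self.dsa = BottleneckAdapter(dim, hidden_dim)  # domain-specific knowledge

    def forward(self, block_in: torch.Tensor, block_out: torch.Tensor) -> torch.Tensor:
        discarded = block_in - block_out  # information dropped by the frozen block
        return block_out + self.dia(block_out) + self.dsa(discarded)


class AdaptedEncoderBlock(nn.Module):
    # Wraps one frozen encoder block and adds the trainable domain-aware adapter.
    def __init__(self, frozen_block: nn.Module, dim: int):
        super().__init__()
        self.block = frozen_block
        for p in self.block.parameters():
            p.requires_grad = False  # the VLM visual encoder stays frozen
        self.adapter = DomainAwareAdapter(dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.adapter(x, self.block(x))


if __name__ == "__main__":
    # Toy usage: a single frozen shape-preserving "block" over 256-d token features.
    block = nn.Sequential(nn.Linear(256, 256), nn.GELU())
    adapted = AdaptedEncoderBlock(block, dim=256)
    feats = torch.randn(2, 197, 256)  # (batch, tokens, dim)
    print(adapted(feats).shape)       # torch.Size([2, 197, 256])

Only the adapter parameters are trainable in this sketch, which is the point of the adapter-based approach: the frozen encoder retains its general knowledge while the DIA and DSA branches add domain-invariant and target-domain-specific knowledge on top.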
Comment: Accepted by NeurIPS 2024
Database: arXiv