Multi-Domain Long-Tailed Learning: Challenges, Progress, and Prospects

Author: Panpan Fu, Umi Kalsom Yusof
Language: English
Year of publication: 2024
Subject:
Source: IEEE Access, Vol 12, Pp 129528-129540 (2024)
Document type: article
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3413578
Description: In practical applications, the issue of data imbalance inevitably arises. Most studies of long-tailed class imbalance focus on a single-domain setting, in which training and test samples are assumed to come from the same feature space and to follow the same data distribution. However, real-world datasets may span distinct domains, and a minority class in one domain can be a majority class in another. Multi-domain long-tailed learning is the process of acquiring knowledge from imbalanced datasets spanning multiple domains, such that the learned model generalizes to all classes across all domains. This study offers a comprehensive review of existing multi-domain long-tailed learning methods, covering challenges, research advances, and prospects. It first defines multi-domain long-tailed learning and its associated challenges; it then introduces an overall categorization of existing methods and provides an overview of these research advances.
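The abstract's key observation is that a class's rarity can flip between domains. A minimal sketch of this, using a hypothetical two-domain toy dataset (the domain names, class labels, and counts below are illustrative assumptions, not from the article):

```python
from collections import Counter

# Hypothetical multi-domain dataset of (domain, class_label) pairs.
# "bird" is a minority class in the "photo" domain but a majority
# class in the "sketch" domain, illustrating the multi-domain
# long-tailed setting described in the abstract.
samples = (
    [("photo", "dog")] * 90 + [("photo", "bird")] * 10 +
    [("sketch", "dog")] * 15 + [("sketch", "bird")] * 85
)

def class_frequencies(samples):
    """Return per-domain class counts: {domain: Counter(label -> count)}."""
    per_domain = {}
    for domain, label in samples:
        per_domain.setdefault(domain, Counter())[label] += 1
    return per_domain

freqs = class_frequencies(samples)
# "bird" is rare among "photo" samples but dominant among "sketch" ones.
print(freqs["photo"]["bird"], freqs["sketch"]["bird"])  # 10 85
```

A single-domain rebalancing scheme (e.g. reweighting by global class frequency) would treat "bird" as neither rare nor common overall, which is why the surveyed methods must account for per-domain label distributions.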
Database: Directory of Open Access Journals