An Efficient Transfer Learning Method with Auxiliary Information.

Authors: Bo Liu, Liangjiao Li, Yanshan Xiao, Kai Wang, Jian Hu, Junrui Liu, Ruiguang Huang
Source: ACM Transactions on Knowledge Discovery from Data; Jan 2024, Vol. 18, Issue 1, p1-23, 23p
Abstract: Transfer learning (TL) is a learning tool that reuses information across tasks and can achieve better classification performance than traditional single-task learning, because it shares information between task models. Most TL algorithms focus on data-level improvement, performing data extraction and transformation, but they ignore additional information in the training data that could improve a model's accuracy, such as Universum samples and privileged information. In this article, we focus on incorporating prior data into the TL algorithm: additional features, also called privileged information, are brought into learning to improve the learning paradigm. In addition, we bring Universum samples, which do not belong to any of the indicated categories, into the transfer learning paradigm to improve the utilization of prior knowledge. We propose a new TL model (PU-TLSVM), in which each task is considered together with its corresponding privileged features and Universum data, so that tasks with prior data can be used in the training stage. We then apply the Lagrange duality theorem to optimize our model and obtain the optimal discriminant for target-task classification. Finally, we carry out extensive predictions and tests to compare the actual effectiveness of the proposed method with previous methods. The experimental results indicate that the proposed method is more effective and robust than the other baselines. [ABSTRACT FROM AUTHOR]
Database: Complementary Index
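The abstract's core idea of training on Universum samples — points that belong to no target class but are kept close to the decision boundary via an ε-insensitive penalty — can be sketched as follows. This is a minimal illustrative subgradient-descent version of a linear Universum SVM, not the authors' PU-TLSVM dual formulation; the function name, toy data, and all hyperparameters are hypothetical.

```python
import numpy as np

def train_universum_svm(X, y, X_univ, lam=0.01, c_u=0.5, eps=0.1,
                        lr=0.05, epochs=200, seed=0):
    """Linear SVM with an epsilon-insensitive Universum penalty.

    Minimizes  lam/2 * ||w||^2
             + mean hinge(1 - y * (X w + b))             # labeled task data
             + c_u * mean max(0, |X_u w + b| - eps)      # Universum samples
    by plain subgradient descent (illustrative only).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = rng.normal(scale=0.01, size=d)
    b = 0.0
    for _ in range(epochs):
        # hinge-loss subgradient on the labeled data
        margins = y * (X @ w + b)
        active = margins < 1
        gw = lam * w - (y[active, None] * X[active]).sum(0) / n
        gb = -y[active].sum() / n
        # epsilon-insensitive penalty pulls Universum points toward the boundary
        fu = X_univ @ w + b
        viol = np.abs(fu) > eps
        s = np.sign(fu[viol])
        gw += c_u * (s[:, None] * X_univ[viol]).sum(0) / len(X_univ)
        gb += c_u * s.sum() / len(X_univ)
        w -= lr * gw
        b -= lr * gb
    return w, b

# toy example: two Gaussian blobs plus Universum points between them
rng = np.random.default_rng(1)
X_pos = rng.normal(loc=+2.0, size=(50, 2))
X_neg = rng.normal(loc=-2.0, size=(50, 2))
X = np.vstack([X_pos, X_neg])
y = np.r_[np.ones(50), -np.ones(50)]
X_univ = rng.normal(loc=0.0, scale=0.3, size=(30, 2))  # "neither class" samples

w, b = train_universum_svm(X, y, X_univ)
acc = np.mean(np.sign(X @ w + b) == y)
```

The paper additionally couples such per-task classifiers through shared transfer terms and privileged features, and solves the resulting problem in its Lagrange dual; this sketch shows only the Universum component of the loss.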