Hybrid Learning with Teacher-student Knowledge Distillation for Recommenders

Authors: Hangbin Zhang, Victor W. Chu, Raymond K. Wong
Year of publication: 2020
Source: ICDM (Workshops)
DOI: 10.1109/icdmw51313.2020.00040
Description: Latent variable models have been widely adopted by recommender systems due to advances in their learning scalability and performance. Recent research has focused on hybrid models. However, because user and/or item data are sparse, most of these proposals rely on convoluted model architectures and objective functions. In particular, the latter are mostly tailored to sparse data from either the user or the item space. Although it is possible to derive an analogous model for both spaces, doing so makes a system overly complicated. To address this problem, we propose a deep-learning-based latent model called Distilled Hybrid Network (DHN) with a teacher-student learning architecture. Unlike related work that tries to better incorporate content components to improve accuracy, we instead focus on optimizing model learning. To the best of our knowledge, we are the first to employ a teacher-student learning architecture for recommender systems. Experimental results show that our proposed model notably outperforms state-of-the-art approaches. We also show that our proposed architecture can be applied to existing recommender models to improve their accuracy.
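The record does not spell out DHN's objective function, but the teacher-student learning it names is conventionally trained with a distillation loss that blends a hard cross-entropy term on the true label with a soft cross-entropy term against temperature-softened teacher outputs (Hinton et al., 2015). The sketch below illustrates that generic loss with NumPy; all function names, the temperature `T`, and the mixing weight `alpha` are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T flattens the distribution.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, true_idx, T=2.0, alpha=0.5):
    # Hard term: cross-entropy between student output and the ground-truth label.
    p_student = softmax(student_logits)
    hard = -np.log(p_student[true_idx] + 1e-12)
    # Soft term: cross-entropy between temperature-softened teacher and student outputs.
    p_teacher_T = softmax(teacher_logits, T)
    p_student_T = softmax(student_logits, T)
    soft = -np.sum(p_teacher_T * np.log(p_student_T + 1e-12))
    # T**2 rescales the soft-term gradients to match the hard term's magnitude.
    return alpha * hard + (1.0 - alpha) * (T ** 2) * soft
```

A student whose logits agree with both the teacher and the true label incurs a lower loss than one that disagrees, which is the training signal a distilled recommender would minimize.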
Database: OpenAIRE