Hybrid Learning with Teacher-student Knowledge Distillation for Recommenders
Author: Hangbin Zhang, Victor W. Chu, Raymond K. Wong
Year: 2020
Subject: Deep learning; Knowledge engineering; Latent variable models; Recommender systems; Machine learning; Artificial intelligence; Sparse data
Source: ICDM (Workshops)
DOI: 10.1109/icdmw51313.2020.00040
Abstract: Latent variable models have been widely adopted by recommender systems due to advances in their learning scalability and performance. Recent research has focused on hybrid models. However, due to the sparsity of user and/or item data, most of these proposals rely on convoluted model architectures and objective functions. In particular, the objective functions are mostly tailored to sparse data from either the user or the item space. Although it is possible to derive an analogous model for both spaces, doing so makes a system overly complicated. To address this problem, we propose a deep-learning-based latent model called Distilled Hybrid Network (DHN) with a teacher-student learning architecture. Unlike related work that tries to better incorporate content components to improve accuracy, we instead focus on optimizing model learning. To the best of our knowledge, we are the first to employ a teacher-student learning architecture for recommender systems. Experimental results show that our proposed model notably outperforms state-of-the-art approaches. We also show that our proposed architecture can be applied to existing recommender models to improve their accuracy.
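The abstract names teacher-student knowledge distillation but the record gives no implementation details of DHN, so the following is only a minimal, generic sketch of the standard distillation objective such an architecture typically optimizes: a student's item-ranking scores are fit to a blend of the ground-truth interaction (hard cross-entropy) and the teacher's temperature-softened predictions (KL term). All function names and hyperparameter values here are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def softmax(scores, T=1.0):
    # Temperature-scaled softmax over a vector of item scores.
    z = np.asarray(scores, dtype=float) / T
    e = np.exp(z - z.max())
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, true_item, T=2.0, alpha=0.5):
    """Generic teacher-student loss (illustrative, not the DHN objective).

    alpha weights the hard-label cross-entropy on the observed interaction;
    (1 - alpha) weights the KL divergence from the teacher's softened
    distribution to the student's, scaled by T^2 as is conventional.
    """
    p_student = softmax(student_logits)
    hard = -np.log(p_student[true_item] + 1e-12)

    q_teacher = softmax(teacher_logits, T)
    q_student = softmax(student_logits, T)
    soft = np.sum(q_teacher * (np.log(q_teacher + 1e-12)
                               - np.log(q_student + 1e-12))) * T * T
    return alpha * hard + (1 - alpha) * soft
```

With identical student and teacher logits the KL term vanishes, so the loss reduces to the hard cross-entropy alone; as the student drifts from the teacher, the soft term penalizes the divergence.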
Database: OpenAIRE
External link: