Author:
Ren, Yimeng; Liang, Kun; Shang, Yuhu; Zhang, Xiankun
Subject:
Source:
Complex & Intelligent Systems; Apr 2023, Vol. 9, Issue 2, p2159-2176, 18p
Abstract:
Top-N recommendation has received considerable attention for providing students with personalized learning guidance in a required subject or domain. Existing approaches generally aim to maximize the overall accuracy of the recommendation list while ignoring the accuracy of the highest-ranked recommended exercises, which seriously dampens students' learning enthusiasm. Motivated by the Knowledge Distillation (KD) technique, we design a fully adaptive recommendation paradigm, the Top-enhanced Recommender Distillation (TERD) framework, to improve recommendation quality at the top positions. Specifically, TERD transfers the knowledge of an arbitrary recommender (the teacher network) and injects it into a carefully designed student network. The prior knowledge provided by the teacher network, including student-exercise embeddings and candidate exercise subsets, is further used to define the state and action spaces of the student network (a DDQN). In addition, the student network introduces a well-designed state representation scheme and an effective individual ability tracing model to enhance recommendation accuracy at the top positions. TERD follows a flexible, model-agnostic paradigm that not only simplifies the action space of the student network but also improves recommendation accuracy at the top positions, thereby enhancing students' motivation and engagement in e-learning environments. We implement the proposed approach on three well-established, publicly available datasets and evaluate its Top-enhanced performance; the experimental results show that the proposed TERD scheme effectively resolves the Top-enhanced recommendation problem. [ABSTRACT FROM AUTHOR]
Database:
Complementary Index
External link:
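The abstract above describes TERD's core mechanism: a teacher recommender supplies student-exercise embeddings and a candidate exercise subset, and these define the state and action space of a DDQN-style student network. The sketch below is a minimal illustration of that idea, not the authors' implementation; the embedding sizes, the linear Q-function, the history-based state representation, and every variable name are assumptions made here for illustration only.

```python
# Illustrative sketch (assumed, not from the paper): a frozen "teacher"
# recommender provides embeddings and a candidate subset; a DDQN-style
# "student" scores only those candidates and forms a Double-DQN target.
import numpy as np

rng = np.random.default_rng(0)

EMB_DIM = 8          # embedding size, assumed for illustration
N_EXERCISES = 50     # total exercise pool
K_CANDIDATES = 10    # size of the teacher's candidate subset (action space)

# --- Prior knowledge from the (frozen) teacher recommender ------------------
student_emb = rng.normal(size=EMB_DIM)                      # one learner
exercise_emb = rng.normal(size=(N_EXERCISES, EMB_DIM))      # all exercises
teacher_scores = exercise_emb @ student_emb                 # teacher ranking
candidate_ids = np.argsort(-teacher_scores)[:K_CANDIDATES]  # restricted actions

# --- Student network: linear Q-function over (state, action) features -------
STATE_DIM = 2 * EMB_DIM  # state = [learner embedding ; summary of answered exercises]

def state_vector(history_ids):
    """Simple state representation: learner embedding plus history summary."""
    if len(history_ids) == 0:
        hist = np.zeros(EMB_DIM)
    else:
        hist = exercise_emb[history_ids].mean(axis=0)
    return np.concatenate([student_emb, hist])

def q_values(weights, state):
    """Q(s, a) for every candidate action under a linear model."""
    feats = np.concatenate(
        [np.tile(state, (K_CANDIDATES, 1)), exercise_emb[candidate_ids]], axis=1
    )
    return feats @ weights

online_w = rng.normal(scale=0.1, size=STATE_DIM + EMB_DIM)
target_w = online_w.copy()

# --- One Double-DQN style target computation ---------------------------------
gamma = 0.9
state = state_vector(history_ids=[])
action = int(np.argmax(q_values(online_w, state)))            # recommend top action
reward = 1.0                                                   # e.g., learner answered correctly
next_state = state_vector(history_ids=[candidate_ids[action]])

# Double DQN: action chosen by the online net, value taken from the target net.
next_action = int(np.argmax(q_values(online_w, next_state)))
td_target = reward + gamma * q_values(target_w, next_state)[next_action]
print("selected exercise:", candidate_ids[action], "TD target:", round(td_target, 3))
```

Restricting the student's actions to the teacher's candidate subset is what the abstract refers to as simplifying the action space: the DDQN only has to rank K candidates rather than the full exercise pool, which concentrates learning capacity on the top positions.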