EmbSum: Leveraging the Summarization Capabilities of Large Language Models for Content-Based Recommendations

Authors: Zhang, Chiyu; Sun, Yifei; Wu, Minghao; Chen, Jun; Lei, Jie; Abdul-Mageed, Muhammad; Jin, Rong; Liu, Angli; Zhu, Ji; Park, Sem; Yao, Ning; Long, Bo
Publication year: 2024
Document type: Working Paper
Description: Content-based recommendation systems play a crucial role in delivering personalized content to users in the digital world. In this work, we introduce EmbSum, a novel framework that enables offline pre-computation of users and candidate items while capturing the interactions within the user engagement history. Using a pretrained encoder-decoder model and poly-attention layers, EmbSum derives User Poly-Embeddings (UPE) and Content Poly-Embeddings (CPE) to compute relevance scores between users and candidate items. EmbSum actively learns from long user engagement histories by generating user-interest summaries under supervision from a large language model (LLM). The effectiveness of EmbSum is validated on two datasets from different domains, where it surpasses state-of-the-art (SoTA) methods with higher accuracy and fewer parameters. Additionally, the model's ability to generate summaries of user interests is a valuable by-product that enhances its usefulness for personalized content recommendations.
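Note: The abstract mentions poly-attention layers that produce User Poly-Embeddings (UPE) and Content Poly-Embeddings (CPE), which are then matched to score relevance. The sketch below is only an illustrative interpretation of that idea, not the authors' implementation: the class and function names (PolyAttention, relevance_score), the dimensions, the number of codes, and the sum-of-inner-products scoring rule are all assumptions made for demonstration.

```python
import torch
import torch.nn as nn

class PolyAttention(nn.Module):
    """Pools a sequence of hidden states into K embeddings via K learnable context codes.

    Illustrative sketch only; the paper's exact layer may differ.
    """
    def __init__(self, hidden_dim: int, num_codes: int):
        super().__init__()
        # One learnable query code per output embedding.
        self.codes = nn.Parameter(torch.randn(num_codes, hidden_dim) * 0.02)
        self.proj = nn.Linear(hidden_dim, hidden_dim, bias=False)

    def forward(self, hidden: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq_len, hidden_dim); mask: (batch, seq_len), 1 for valid positions.
        scores = torch.matmul(self.proj(hidden), self.codes.t())          # (batch, seq_len, K)
        scores = scores.masked_fill(mask.unsqueeze(-1) == 0, float("-inf"))
        attn = torch.softmax(scores, dim=1)                               # attend over the sequence
        return torch.einsum("bsk,bsd->bkd", attn, hidden)                 # (batch, K, hidden_dim)

def relevance_score(upe: torch.Tensor, cpe: torch.Tensor) -> torch.Tensor:
    # upe: (batch, K_u, d) user poly-embedding; cpe: (batch, K_c, d) content poly-embedding.
    # Sum of all pairwise inner products, one simple way to match the two embedding sets.
    return torch.einsum("bud,bcd->buc", upe, cpe).sum(dim=(1, 2))

# Toy usage: random tensors stand in for outputs of a pretrained encoder-decoder.
user_hidden = torch.randn(2, 50, 64)   # encoded user engagement history
item_hidden = torch.randn(2, 30, 64)   # encoded candidate item
upe = PolyAttention(64, num_codes=8)(user_hidden, torch.ones(2, 50))
cpe = PolyAttention(64, num_codes=4)(item_hidden, torch.ones(2, 30))
print(relevance_score(upe, cpe).shape)  # torch.Size([2])
```

Because both UPE and CPE depend only on their own inputs, they can be pre-computed offline and the scoring reduces to inexpensive inner products at serving time, which is the efficiency property the abstract highlights.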
Comment: Accepted by RecSys 2024
Database: arXiv