Author: |
Qin, Chaoyong, Xie, Jialin, Jiang, Qiuxian, Chen, Xin |
Subject: |
|
Source: |
Neural Computing & Applications; Dec2023, Vol. 35 Issue 35, p24665-24680, 16p |
Abstract: |
Efficiently extracting user interest from user behavior sequences is the key to improving the click-through rate (CTR), and learning sophisticated feature interaction information is also critical for maximizing CTR. However, in terms of interest extraction, the sequence dependence inherent in most existing methods results in low training efficiency. Meanwhile, when exploring high-order feature interactions, existing methods fail to exploit information from all layers of the model. In this study, we propose an interest evolution network (TGRIEN) based on a transformer and a gated residual. First, a transformer network supervised by an auxiliary loss function is proposed to extract users' interests from behavioral sequences in parallel, enhancing training efficiency. Second, a minimal gated unit with an attention forget gate is constructed to detect interests related to target ads and to capture the evolution of users' interests. A gating mechanism is also employed in the residual module to construct a skip gated residual network, which captures richer and more effective feature interaction information in several ways. We evaluate the performance of TGRIEN on two real-world datasets. Experimental results demonstrate that our model significantly outperforms state-of-the-art baselines in terms of both prediction and training efficiency. [ABSTRACT FROM AUTHOR] |
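The abstract does not specify the skip gated residual network beyond the general idea of gating the residual path. Below is a minimal, hypothetical sketch of that gating pattern in NumPy: a learned sigmoid gate interpolates per dimension between the transformed features and the raw skip input. All names (`SkipGatedResidual`, `W_f`, `W_g`) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SkipGatedResidual:
    """One gated residual block: a sigmoid gate decides, per dimension,
    how much of the transformed features vs. the raw input passes through."""

    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.W_f = rng.normal(0.0, 0.1, (dim, dim))  # feature transform weights
        self.W_g = rng.normal(0.0, 0.1, (dim, dim))  # gate weights

    def forward(self, x):
        f = np.tanh(x @ self.W_f)      # transformed features
        g = sigmoid(x @ self.W_g)      # gate values in (0, 1)
        return g * f + (1.0 - g) * x   # gated skip connection

# Toy usage: a batch of 2 feature vectors of dimension 4.
x = np.ones((2, 4))
block = SkipGatedResidual(dim=4)
y = block.forward(x)
print(y.shape)  # (2, 4)
```

Because the gate output lies strictly between 0 and 1, the block can smoothly fall back to the identity mapping (gate near 0) or the transformed features (gate near 1), which is the usual motivation for gating a residual connection.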
Database: |
Complementary Index |
External link: |
|