Showing 1 - 10 of 113 for search: '"Tang Jiaxi"'
Published in:
Open Life Sciences, Vol 18, Iss 1, Pp e2449-20 (2023)
External link:
https://doaj.org/article/4d00166a83cb41c1a4802a54e519ab31
Author:
Tang, Jiaxi, Drori, Yoel, Chang, Daryl, Sathiamoorthy, Maheswaran, Gilmer, Justin, Wei, Li, Yi, Xinyang, Hong, Lichan, Chi, Ed H.
Recommender systems play an important role in many content platforms. While most recommendation research is dedicated to designing better models to improve user experience, we found that research on stabilizing the training for such models is severely …
External link:
http://arxiv.org/abs/2302.09178
Recommender systems play an important role in modern information and e-commerce applications. While increasing research is dedicated to improving the relevance and diversity of the recommendations, the potential risks of state-of-the-art recommendation …
External link:
http://arxiv.org/abs/2008.04876
Knowledge Distillation (KD) is a model-agnostic technique to improve model quality while having a fixed capacity budget. It is a commonly used technique for model compression, where a larger capacity teacher model with better quality is used to train …
External link:
http://arxiv.org/abs/2002.03532
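The teacher-student setup that this abstract describes can be sketched in a few lines. This is a generic illustration of the standard distillation loss (temperature-softened teacher targets blended with ground-truth cross-entropy), not the specific method of the cited paper; all function names and the `T`/`alpha` defaults are illustrative.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature yields a softer distribution.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft-target term: cross-entropy between the teacher's and student's
    # temperature-softened distributions.
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    soft = -(p_teacher * np.log(p_student + 1e-12)).sum(axis=-1).mean()
    # Hard-target term: ordinary cross-entropy against the ground-truth labels.
    p_hard = softmax(student_logits)
    hard = -np.log(p_hard[np.arange(len(labels)), labels] + 1e-12).mean()
    # Blend the two; the T**2 factor keeps gradient magnitudes comparable
    # across temperatures, as in the classic distillation formulation.
    return alpha * (T ** 2) * soft + (1 - alpha) * hard
```

In practice the teacher is a larger pre-trained model whose logits are precomputed, and only the student is updated with this combined loss.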
Author:
Shih, Hai-Jung Steffi, Ai, Joyce, Abe, Justin, Tang, Jiaxi, Rowley, K. Michael, Van Dillen, Linda R., Kulig, Kornelia
Published in:
In Journal of Electromyography and Kinesiology August 2023 71
Author:
Tang, Jiaxi, Belletti, Francois, Jain, Sagar, Chen, Minmin, Beutel, Alex, Xu, Can, Chi, Ed H.
Understanding temporal dynamics has proved to be highly valuable for accurate recommendation. Sequential recommenders have been successful in modeling the dynamics of users and items over time. However, while different model architectures excel at capturing …
External link:
http://arxiv.org/abs/1902.08588
Author:
Qiu, Yueqin, Tang, Jiaxi, Wang, Li, Yang, Xuefen, Jiang, Zongyong
Published in:
International Journal of Molecular Sciences, Mar 2024, Vol. 25, Issue 6, p. 3199 (15 pp.)
Author:
Tang, Jiaxi, Wang, Ke
We propose a novel way to train ranking models, such as recommender systems, that are both effective and efficient. Knowledge distillation (KD) was shown to be successful in image recognition to achieve both effectiveness and efficiency. We propose a …
External link:
http://arxiv.org/abs/1809.07428
Author:
Tang, Jiaxi, Wang, Ke
Top-$N$ sequential recommendation models each user as a sequence of items interacted in the past and aims to predict top-$N$ ranked items that a user will likely interact in a 'near future'. The order of interaction implies that sequential patterns play …
External link:
http://arxiv.org/abs/1809.07426
Academic article
This result is not available to unauthenticated users; sign in to view it.