Showing 1 - 10 of 172 for search: '"Lu, Jingwen"'
Author:
Chen, Qi, Geng, Xiubo, Rosset, Corby, Buractaon, Carolyn, Lu, Jingwen, Shen, Tao, Zhou, Kun, Xiong, Chenyan, Gong, Yeyun, Bennett, Paul, Craswell, Nick, Xie, Xing, Yang, Fan, Tower, Bryan, Rao, Nikhil, Dong, Anlei, Jiang, Wenqi, Liu, Zheng, Li, Mingqin, Liu, Chuanjie, Li, Zengzhong, Majumder, Rangan, Neville, Jennifer, Oakley, Andy, Risvik, Knut Magne, Simhadri, Harsha Vardhan, Varma, Manik, Wang, Yujing, Yang, Linjun, Yang, Mao, Zhang, Ce
Recent breakthroughs in large models have highlighted the critical significance of data scale, labels and modals. In this paper, we introduce MS MARCO Web Search, the first large-scale information-rich web dataset, featuring millions of real clicked query-document labels…
External link:
http://arxiv.org/abs/2405.07526
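Clicked query-document labels of the kind this dataset provides are the natural supervision signal for retriever training. Below is a minimal, hypothetical sketch of consuming such labels as (query, document) training pairs; the TSV filename and column layout are assumptions for illustration, not the dataset's documented schema.

```python
# Hypothetical sketch: turning clicked query-document labels into
# training pairs for a retriever. The TSV layout (query_id, query,
# doc_id, url) is an assumed example, not the dataset's real schema.
import csv

def load_click_pairs(path):
    """Yield (query, doc_id) pairs from a click-label TSV file."""
    with open(path, newline="", encoding="utf-8") as f:
        for query_id, query, doc_id, url in csv.reader(f, delimiter="\t"):
            yield query, doc_id

pairs = list(load_click_pairs("ms_marco_web_search_clicks.tsv"))
print(f"loaded {len(pairs)} clicked query-document pairs")
```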
Author:
Sun, Hao, Liu, Xiao, Gong, Yeyun, Dong, Anlei, Lu, Jingwen, Zhang, Yan, Yang, Linjun, Majumder, Rangan, Duan, Nan
Knowledge distillation is often used to transfer knowledge from a strong teacher model to a relatively weak student model. Traditional methods include response-based methods and feature-based methods. Response-based methods are widely used but suffer…
External link:
http://arxiv.org/abs/2212.05225
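For context on the response-based distillation the abstract mentions, here is a minimal sketch of the standard formulation: the student matches the teacher's temperature-softened output distribution alongside the usual supervised loss. The temperature and weighting values are illustrative defaults, not values from the paper.

```python
# Minimal sketch of response-based knowledge distillation: the student
# mimics the teacher's softened output distribution. T and alpha are
# illustrative defaults, not values taken from the paper.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # KL divergence between temperature-softened distributions,
    # scaled by T^2 to keep gradient magnitudes comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)  # supervised term
    return alpha * soft + (1 - alpha) * hard
```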
Author:
Zhou, Kun, Gong, Yeyun, Liu, Xiao, Zhao, Wayne Xin, Shen, Yelong, Dong, Anlei, Lu, Jingwen, Majumder, Rangan, Wen, Ji-Rong, Duan, Nan, Chen, Weizhu
Sampling proper negatives from a large document pool is vital to effectively train a dense retrieval model. However, existing negative sampling strategies suffer from the uninformative or false negative problem. In this work, we empirically show that…
External link:
http://arxiv.org/abs/2210.11773
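To make the uninformative/false-negative tradeoff concrete, the sketch below samples negatives from the middle of a retrieved ranking: the very top ranks are skipped as likely false negatives, and the deep tail is skipped as uninformative. This heuristic is a common mitigation offered for illustration, not the paper's exact method.

```python
# Illustrative negative sampling for dense retrieval. Skipping the
# top-ranked docs (likely false negatives) and the deep tail
# (uninformative) is a generic heuristic, not this paper's method.
import random

def sample_negatives(ranked_doc_ids, positive_ids, k=8, skip_top=10, pool=200):
    """Pick k negatives from the middle band of a retrieved ranking."""
    candidates = [
        d for d in ranked_doc_ids[skip_top:pool]
        if d not in positive_ids  # drop labeled positives outright
    ]
    return random.sample(candidates, min(k, len(candidates)))
```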
Author:
Lin, Zhenghao, Gong, Yeyun, Liu, Xiao, Zhang, Hang, Lin, Chen, Dong, Anlei, Jiao, Jian, Lu, Jingwen, Jiang, Daxin, Majumder, Rangan, Duan, Nan
Knowledge distillation is an effective way to transfer knowledge from a strong teacher to an efficient student model. Ideally, we expect the better the teacher is, the better the student. However, this expectation does not always come true. It is common…
External link:
http://arxiv.org/abs/2209.13335
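One generic way to address the teacher-student gap the abstract describes is to distill through a sequence of progressively stronger teachers, so each stage's gap stays small. The schematic sketch below illustrates that idea only; it reuses the `distillation_loss` sketch above, and the models, optimizer, and data loader are assumed to exist. It is not the paper's actual pipeline.

```python
# Schematic sketch of bridging the teacher-student gap with a chain of
# intermediate teachers; `distillation_loss` is the response-based
# sketch defined earlier. Training details are deliberately generic.
import torch

def progressive_distill(student, teachers, loader, optimizer, epochs=1):
    for teacher in teachers:            # ordered weakest -> strongest
        teacher.eval()
        for _ in range(epochs):
            for inputs, labels in loader:
                with torch.no_grad():
                    t_logits = teacher(inputs)
                s_logits = student(inputs)
                loss = distillation_loss(s_logits, t_logits, labels)
                optimizer.zero_grad()
                loss.backward()
                optimizer.step()
    return student
```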
Author:
Lu, Jingwen, Tang, Chaowei, Chen, Zhengchuan, Guo, Jiayuan, Zou, Aobo, Yang, Wen, Tang, Chenxi
Published in:
In Computer Networks, January 2025, 256
Published in:
In Computers and Electrical Engineering, August 2024, 118 Part A
Author:
Lu, Jingwen, Liu, Yu, Song, Miao, Xi, Yitao, Yang, Hong, Liu, Wenbo, Li, Xiao, Norvienyeku, Justice, Zhang, Yu, Miao, Weiguo, Lin, Chunhua
Published in:
In Microbiological Research, July 2024, 284
Author:
Pierse, Nuo Wang, Lu, Jingwen
We demonstrate that explicitly aligning the pretraining objectives to the finetuning objectives in language model training significantly improves the finetuning task performance and reduces the minimum amount of finetuning examples required. The performance…
External link:
http://arxiv.org/abs/2002.02000
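One common way such alignment is realized is to phrase the finetuning task in the same form as a masked-LM pretraining objective, so the model reuses its pretrained prediction head. The cloze template and label words below are invented for illustration and are not taken from the paper.

```python
# Illustrative sketch of aligning a classification finetuning task with
# a masked-LM pretraining objective by casting it as a cloze. The
# template and label words are hypothetical examples.
def to_cloze(review_text):
    return f"{review_text} Overall, the movie was [MASK]."

LABEL_WORDS = {"positive": "great", "negative": "terrible"}

print(to_cloze("The plot dragged and the acting felt flat."))
# The finetuning target is then the label word ("great"/"terrible")
# predicted at the [MASK] position, matching the pretraining format.
```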
Academic article
Sign-in is required to view this result.
Academic article
Sign-in is required to view this result.