Showing 1 - 7 of 7 for search: '"Su, Hongjin"'
Author:
Su, Hongjin, Yen, Howard, Xia, Mengzhou, Shi, Weijia, Muennighoff, Niklas, Wang, Han-yu, Liu, Haisu, Shi, Quan, Siegel, Zachary S., Tang, Michael, Sun, Ruoxi, Yoon, Jinsung, Arik, Sercan O., Chen, Danqi, Yu, Tao
Existing retrieval benchmarks primarily consist of information-seeking queries (e.g., aggregated questions from search engines) where keyword or semantic-based retrieval is usually sufficient. However, many complex real-world queries require in-depth …
External link:
http://arxiv.org/abs/2407.12883
Author:
Su, Hongjin, Jiang, Shuyang, Lai, Yuhang, Wu, Haoyuan, Shi, Boao, Liu, Che, Liu, Qian, Yu, Tao
Recently, the retrieval-augmented generation (RAG) paradigm has drawn much attention for its potential to incorporate external knowledge into large language models (LLMs) without further training. While widely explored in natural language applications …
External link:
http://arxiv.org/abs/2402.12317
Author:
Muennighoff, Niklas, Su, Hongjin, Wang, Liang, Yang, Nan, Wei, Furu, Yu, Tao, Singh, Amanpreet, Kiela, Douwe
All text-based language problems can be reduced to either generation or embedding. Current models only perform well at one or the other. We introduce generative representational instruction tuning (GRIT), whereby a large language model is trained to …
External link:
http://arxiv.org/abs/2402.09906
Author:
Xie, Tianbao, Zhou, Fan, Cheng, Zhoujun, Shi, Peng, Weng, Luoxuan, Liu, Yitao, Hua, Toh Jing, Zhao, Junning, Liu, Qian, Liu, Che, Liu, Leo Z., Xu, Yiheng, Su, Hongjin, Shin, Dongchan, Xiong, Caiming, Yu, Tao
Language agents show potential in utilizing natural language for varied and intricate tasks in diverse environments, particularly when built upon large language models (LLMs). Current language agent frameworks aim to facilitate the …
External link:
http://arxiv.org/abs/2310.10634
Author:
Xu, Yiheng, Su, Hongjin, Xing, Chen, Mi, Boyu, Liu, Qian, Shi, Weijia, Hui, Binyuan, Zhou, Fan, Liu, Yitao, Xie, Tianbao, Cheng, Zhoujun, Zhao, Siheng, Kong, Lingpeng, Wang, Bailin, Xiong, Caiming, Yu, Tao
We introduce Lemur and Lemur-Chat, openly accessible language models optimized for both natural language and coding capabilities to serve as the backbone of versatile language agents. The evolution from language chat models to functional language agents …
External link:
http://arxiv.org/abs/2310.06830
Author:
Su, Hongjin, Shi, Weijia, Kasai, Jungo, Wang, Yizhong, Hu, Yushi, Ostendorf, Mari, Yih, Wen-tau, Smith, Noah A., Zettlemoyer, Luke, Yu, Tao
We introduce INSTRUCTOR, a new method for computing text embeddings given task instructions: every text input is embedded together with instructions explaining the use case (e.g., task and domain descriptions). Unlike encoders from prior work that …
External link:
http://arxiv.org/abs/2212.09741
Author:
Su, Hongjin, Kasai, Jungo, Wu, Chen Henry, Shi, Weijia, Wang, Tianlu, Xin, Jiayi, Zhang, Rui, Ostendorf, Mari, Zettlemoyer, Luke, Smith, Noah A., Yu, Tao
Many recent approaches to natural language tasks are built on the remarkable abilities of large language models. Large language models can perform in-context learning, where they learn a new task from a few task demonstrations, without any parameter …
External link:
http://arxiv.org/abs/2209.01975