Showing 1 - 10 of 1,401
for search: '"Zheni A"'
Published in:
In China Geology June 2021 4(2):197-204
Author:
Zeng, Zheni, Chen, Yuxuan, Yu, Shi, Wang, Ruobing, Yan, Yukun, Liu, Zhenghao, Wang, Shuo, Han, Xu, Liu, Zhiyuan, Sun, Maosong
Humans can utilize techniques to quickly acquire knowledge from specific materials in advance, such as creating self-assessment questions, enabling us to achieve related tasks more efficiently. In contrast, large language models (LLMs) usually reli
External link:
http://arxiv.org/abs/2411.14790
Published in:
China Geology, Vol 4, Iss 2, Pp 197-204 (2021)
The authors reassessed the taxonomic distinction of Iteravis huchzermeyeri and Gansus zheni, two species of Ornithuromorpha based on specimens from the same locality in western Liaoning, derived from the Jehol Biota. The detailed comparis
External link:
https://doaj.org/article/bb03cfb97fef4a1eb437fd496fff0a8d
Author:
Li, Xinze, Mei, Sen, Liu, Zhenghao, Yan, Yukun, Wang, Shuo, Yu, Shi, Zeng, Zheni, Chen, Hao, Yu, Ge, Liu, Zhiyuan, Sun, Maosong, Xiong, Chenyan
Retrieval-Augmented Generation (RAG) has proven its effectiveness in mitigating hallucinations in Large Language Models (LLMs) by retrieving knowledge from external resources. To adapt LLMs for RAG pipelines, current approaches use instruction tuning
External link:
http://arxiv.org/abs/2410.13509
Author:
Zeng, Zheni, Chen, Jiayi, Chen, Huimin, Yan, Yukun, Chen, Yuxuan, Liu, Zhenghao, Liu, Zhiyuan, Sun, Maosong
Large language models exhibit aspects of human-level intelligence that catalyze their application as human-like agents in domains such as social simulations, human-machine interactions, and collaborative multi-agent systems. However, the absence of d
External link:
http://arxiv.org/abs/2407.12393
Author:
Xu, Zhipeng, Liu, Zhenghao, Yan, Yukun, Wang, Shuo, Yu, Shi, Zeng, Zheni, Xiao, Chaojun, Liu, Zhiyuan, Yu, Ge, Xiong, Chenyan
Retrieval-Augmented Generation (RAG) enables Large Language Models (LLMs) to leverage external knowledge, enhancing their performance on knowledge-intensive tasks. However, existing RAG models often treat LLMs as passive recipients of information, wh
External link:
http://arxiv.org/abs/2402.13547
Author:
Wang, Zheni1 (AUTHOR), Carroll, Steve1 (AUTHOR) steveecarroll82@gmail.com, Wang, Eric H.2 (AUTHOR) eriwang@ucdavis.edu
Published in:
Behavioral Sciences (2076-328X). Nov2024, Vol. 14 Issue 11, p1014. 11p.
Author:
Song, Chenyang, Han, Xu, Zeng, Zheni, Li, Kuai, Chen, Chen, Liu, Zhiyuan, Sun, Maosong, Yang, Tao
Continual learning necessitates the continual adaptation of models to newly emerging tasks while minimizing the catastrophic forgetting of old ones. This is extremely challenging for large language models (LLMs) with vanilla full-parameter tuning due
External link:
http://arxiv.org/abs/2309.14763
Published in:
Acta Scientifica Naturalis. Mar2024, Vol. 11 Issue 1, p30-39. 10p.
Author:
Zeng, Zheni, Yin, Bangchen, Wang, Shipeng, Liu, Jiarui, Yang, Cheng, Yao, Haishen, Sun, Xingzhi, Sun, Maosong, Xie, Guotong, Liu, Zhiyuan
Natural language is expected to be a key medium for various human-machine interactions in the era of large language models. When it comes to the biochemistry field, a series of tasks around molecules (e.g., property prediction, molecule mining, etc.)
External link:
http://arxiv.org/abs/2306.11976