Showing 1 - 10 of 1,815 for search: '"Guo, Pei"'
Prompt optimization algorithms for Large Language Models (LLMs) excel in multi-step reasoning but still lack effective uncertainty estimation. This paper introduces a benchmark dataset to evaluate uncertainty metrics, focusing on Answer, Correctness,
External link:
http://arxiv.org/abs/2409.10044
Author:
Chen, Xia, Luo, Qin-Yue, Guo, Pei-Jie, Zhou, Hao-Jie, Hu, Qi-Cheng, Wu, Hong-Peng, Shen, Xiao-Wen, Cui, Ru-Yue, Dong, Lei, Wei, Tian-Xing, Xiao, Yu-Hang, Li, De-Ren, Lei, Li, Zhang, Xi, Wang, Jun-Feng, Xiang, Gang
Room-temperature (RT) two-dimensional (2D) van der Waals (vdW) ferromagnets hold immense promise for next-generation spintronic devices for information storage and processing. To achieve high-density energy-efficient spintronic devices, it is essenti
External link:
http://arxiv.org/abs/2406.02346
Author:
Qiao, Dan, Su, Yi, Wang, Pinzheng, Ye, Jing, Xie, Wenjing, Zhou, Yuechi, Ding, Yuyang, Tang, Zecheng, Wang, Jikai, Ji, Yixin, Wang, Yue, Guo, Pei, Sun, Zechen, Zhang, Zikang, Li, Juntao, Chao, Pingfu, Chen, Wenliang, Fu, Guohong, Zhou, Guodong, Zhu, Qiaoming, Zhang, Min
Large Language Models (LLMs) have played an important role in many fields due to their powerful capabilities. However, their massive number of parameters leads to high deployment requirements and incurs significant inference costs, which impedes their
External link:
http://arxiv.org/abs/2405.05957
This research paper addresses the challenge of modality mismatch in multimodal learning, where the modalities available during inference differ from those available at training. We propose the Text-centric Alignment for Multi-Modality Learning (TAMML
External link:
http://arxiv.org/abs/2402.08086
In this work, we conduct an assessment of the optimization capabilities of LLMs across various tasks and data sizes. Each of these tasks corresponds to unique optimization domains, and LLMs are required to execute these tasks with interactive prompti
External link:
http://arxiv.org/abs/2310.05204
Author:
Li, Juntao, Tang, Zecheng, Ding, Yuyang, Wang, Pinzheng, Guo, Pei, You, Wangjie, Qiao, Dan, Chen, Wenliang, Fu, Guohong, Zhu, Qiaoming, Zhou, Guodong, Zhang, Min
Large language models (LLMs) with billions of parameters have demonstrated outstanding performance on various natural language processing tasks. This report presents OpenBA, an open-sourced 15B bilingual asymmetric seq2seq model, to contribute an LLM
External link:
http://arxiv.org/abs/2309.10706
Machine learning (ML) models have been quite successful in predicting outcomes in many applications. However, in some cases, domain experts might have a judgment about the expected outcome that might conflict with the prediction of ML models. One mai
External link:
http://arxiv.org/abs/2304.11870
Non-autoregressive neural machine translation (NAT) models have been proposed to accelerate the inference process while maintaining relatively high performance. However, it is difficult for existing NAT models to achieve the desired efficiency-quality trade-off.
External link:
http://arxiv.org/abs/2303.07665
Author:
Tian, Baojiang (45464123tbj@sina.cn), Guo, Pei (gp_0371@163.com), Du, Xingwei (dxw1575@126.com), Liao, Xiaoyu (lxy03711@126.com), Xiao, Chao, Dong, Yiran (202211021168t@stu.cqu.edu.cn), Wang, Jingang (jingang_023@163.com)
Published in:
Energies (1996-1073), Nov 2024, Vol. 17, Issue 21, p. 5495, 12 pp.
Published in:
Lipids in Health and Disease, Vol. 23, Issue 1, pp. 1-3 (2024)
External link:
https://doaj.org/article/ce8a2a8d435340e88829ec67690b95b2