Showing 1 - 10 of 376 for search: '"Li, Xinze"'
Author:
Yang, Weiqing, Wang, Hanbin, Liu, Zhenghao, Li, Xinze, Yan, Yukun, Wang, Shuo, Gu, Yu, Yu, Minghe, Liu, Zhiyuan, Yu, Ge
Debugging is a vital aspect of software development, yet the debugging capabilities of Large Language Models (LLMs) remain largely unexplored. This paper first introduces DEBUGEVAL, a comprehensive benchmark designed to evaluate the debugging capabilities…
External link:
http://arxiv.org/abs/2408.05006
Since ancient times, mechanical design aids have been developed to assist human users, aimed at improving the efficiency and effectiveness of design. However, even with the widespread use of contemporary Computer-Aided Design (CAD) systems, there are…
External link:
http://arxiv.org/abs/2408.02087
Author:
Ma, Yubo, Zang, Yuhang, Chen, Liangyu, Chen, Meiqi, Jiao, Yizhu, Li, Xinze, Lu, Xinyuan, Liu, Ziyu, Ma, Yan, Dong, Xiaoyi, Zhang, Pan, Pan, Liangming, Jiang, Yu-Gang, Wang, Jiaqi, Cao, Yixin, Sun, Aixin
Understanding documents with rich layouts and multi-modal components is a long-standing and practical task. Recent Large Vision-Language Models (LVLMs) have made remarkable strides in various tasks, particularly in single-page document understanding…
External link:
http://arxiv.org/abs/2407.01523
Author:
Li, Xinze
This note is based on Professor Vitali Kapovitch's comparison geometry course at the University of Toronto. It delves into various comparison theorems, including those by Rauch and Toponogov, focusing on their applications, such as Bishop-Gromov volume…
External link:
http://arxiv.org/abs/2404.09792
Structure-based drug design (SBDD), which aims to generate molecules that can bind tightly to the target protein, is an essential problem in drug discovery, and previous approaches have achieved initial success. However, most existing methods still…
External link:
http://arxiv.org/abs/2404.02003
This paper proposes PE-GPT, a custom-tailored large language model uniquely adapted for power converter modulation design. By harnessing in-context learning and specialized tiered physics-informed neural networks, PE-GPT guides users through text-based…
External link:
http://arxiv.org/abs/2403.14059
Author:
Ding, Bosheng, Qin, Chengwei, Zhao, Ruochen, Luo, Tianze, Li, Xinze, Chen, Guizhen, Xia, Wenhan, Hu, Junjie, Luu, Anh Tuan, Joty, Shafiq
In the rapidly evolving field of large language models (LLMs), data augmentation (DA) has emerged as a pivotal technique for enhancing model performance by diversifying training examples without the need for additional data collection. This survey…
External link:
http://arxiv.org/abs/2403.02990
Large language models (LLMs) require lengthy prompts as the input context to produce output aligned with user intentions, a process that incurs extra costs during inference. In this paper, we propose the Gist COnditioned deCOding (Gist-COCO) model…
External link:
http://arxiv.org/abs/2402.16058
Author:
Zhou, Tianshuo, Mei, Sen, Li, Xinze, Liu, Zhenghao, Xiong, Chenyan, Liu, Zhiyuan, Gu, Yu, Yu, Ge
This paper proposes Multi-modAl Retrieval model via Visual modulE pLugin (MARVEL), which learns an embedding space for queries and multi-modal documents to conduct retrieval. MARVEL encodes queries and multi-modal documents with a unified encoder model…
External link:
http://arxiv.org/abs/2310.14037
Although achieving great success, Large Language Models (LLMs) usually suffer from unreliable hallucinations. Although language attribution can be a potential solution, there are no suitable benchmarks and evaluation metrics to attribute LLMs to structured…
External link:
http://arxiv.org/abs/2310.05634