Showing 1 - 10 of 275 for the search: '"Zeng, Qingcheng"'
Author:
Zeng, Guangyang, Mu, Biqiang, Zeng, Qingcheng, Song, Yuchen, Dai, Chulin, Shi, Guodong, Wu, Junfeng
Camera pose estimation is a fundamental problem in robotics. This paper focuses on two issues of interest: First, point and line features have complementary advantages, and it is of great value to design a uniform algorithm that can fuse them effectively…
External link:
http://arxiv.org/abs/2407.16151
Author:
Zeng, Qingcheng, Jin, Mingyu, Yu, Qinkai, Wang, Zhenting, Hua, Wenyue, Zhou, Zihao, Sun, Guangyan, Meng, Yanda, Ma, Shiqing, Wang, Qifan, Juefei-Xu, Felix, Ding, Kaize, Yang, Fan, Tang, Ruixiang, Zhang, Yongfeng
Large Language Models (LLMs) are employed across various high-stakes domains, where the reliability of their outputs is crucial. One commonly used method to assess the reliability of LLMs' responses is uncertainty estimation, which gauges the likelihood…
External link:
http://arxiv.org/abs/2407.11282
Author:
Jin, Mingyu, Yu, Qinkai, Huang, Jingyuan, Zeng, Qingcheng, Wang, Zhenting, Hua, Wenyue, Zhao, Haiyan, Mei, Kai, Meng, Yanda, Ding, Kaize, Yang, Fan, Du, Mengnan, Zhang, Yongfeng
Large language models (LLMs) have shown remarkable performances across a wide range of tasks. However, the mechanisms by which these models encode tasks of varying complexities remain poorly understood. In this paper, we explore the hypothesis that…
External link:
http://arxiv.org/abs/2404.07066
KG-Rank: Enhancing Large Language Models for Medical QA with Knowledge Graphs and Ranking Techniques
Author:
Yang, Rui, Liu, Haoran, Marrese-Taylor, Edison, Zeng, Qingcheng, Ke, Yu He, Li, Wanxin, Cheng, Lechao, Chen, Qingyu, Caverlee, James, Matsuo, Yutaka, Li, Irene
Large language models (LLMs) have demonstrated impressive generative capabilities with the potential to innovate in medicine. However, the application of LLMs in real clinical settings remains challenging due to the lack of factual consistency in the…
External link:
http://arxiv.org/abs/2403.05881
Author:
Zeng, Guangyang, Zeng, Qingcheng, Li, Xinghan, Mu, Biqiang, Chen, Jiming, Shi, Ling, Wu, Junfeng
Given 2D point correspondences between an image pair, inferring the camera motion is a fundamental issue in the computer vision community. The existing works generally set out from the epipolar constraint and estimate the essential matrix, which is…
External link:
http://arxiv.org/abs/2403.01174
Author:
Li, Xinghan, Li, Haoying, Zeng, Guangyang, Zeng, Qingcheng, Ren, Xiaoqiang, Yang, Chao, Wu, Junfeng
A filter for inertial-based odometry is a recursive method used to estimate the pose from measurements of ego-motion and relative pose. Currently, there is no known filter that guarantees the computation of a globally optimal solution for the non-linear…
External link:
http://arxiv.org/abs/2402.05003
Author:
Yang, Rui, Zeng, Qingcheng, You, Keen, Qiao, Yujie, Huang, Lucas, Hsieh, Chia-Chun, Rosand, Benjamin, Goldwasser, Jeremy, Dave, Amisha D, Keenan, Tiarnan D. L., Chew, Emily Y, Radev, Dragomir, Lu, Zhiyong, Xu, Hua, Chen, Qingyu, Li, Irene
This study introduces Ascle, a pioneering natural language processing (NLP) toolkit designed for medical text generation. Ascle is tailored for biomedical researchers and healthcare professionals with an easy-to-use, all-in-one solution that requires…
External link:
http://arxiv.org/abs/2311.16588
Author:
Gao, Fan, Jiang, Hang, Yang, Rui, Zeng, Qingcheng, Lu, Jinghui, Blum, Moritz, Liu, Dairui, She, Tianwei, Jiang, Yuang, Li, Irene
Published in:
ACL 2024 Findings
Educational materials such as survey articles in specialized fields like computer science traditionally require tremendous expert inputs and are therefore expensive to create and update. Recently, Large Language Models (LLMs) have achieved significant…
External link:
http://arxiv.org/abs/2308.10410
While a large body of literature suggests that large language models (LLMs) acquire rich linguistic representations, little is known about whether they adapt to linguistic biases in a human-like way. The present study probes this question by asking whether…
External link:
http://arxiv.org/abs/2305.16917
Author:
Zeng, Qingcheng, Garay, Lucas, Zhou, Peilin, Chong, Dading, Hua, Yining, Wu, Jiageng, Pan, Yikang, Zhou, Han, Voigt, Rob, Yang, Jie
Large pre-trained models have revolutionized natural language processing (NLP) research and applications, but high training costs and limited data resources have prevented their benefits from being shared equally amongst speakers of all the world's languages…
External link:
http://arxiv.org/abs/2211.06993