Showing 1 - 10 of 24 for search: '"Chen, Longze"'
Multi-Object Tracking (MOT) aims to associate multiple objects across video frames and is a challenging vision task due to inherent complexities in the tracking environment. Most existing approaches train and track within a single domain, resulting i…
External link:
http://arxiv.org/abs/2410.23907
Author:
Luo, Jing, Luo, Run, Chen, Longze, Zhu, Liang, Ao, Chang, Li, Jiaming, Chen, Yukun, Cheng, Xin, Yang, Wen, Su, Jiayuan, Li, Chengming, Yang, Min
While closed-source Large Language Models (LLMs) demonstrate strong mathematical problem-solving abilities, open-source models continue to struggle with such tasks. To bridge this gap, we propose a data augmentation approach and introduce PersonaMath…
External link:
http://arxiv.org/abs/2410.01504
Author:
Li, Jiaming, Zhang, Lei, Li, Yunshui, Liu, Ziqiang, Bai, Yuelin, Luo, Run, Chen, Longze, Yang, Min
The instruction-following ability of large language models enables humans to interact with AI agents in a natural way. However, when required to generate responses of a specific length, large language models often struggle to meet users' needs due to…
External link:
http://arxiv.org/abs/2409.18943
In the era of large language models (LLMs), a vast amount of conversation logs will be accumulated thanks to the rapid development trend of language UI. Conversation Analysis (CA) strives to uncover and analyze critical information from conversation…
External link:
http://arxiv.org/abs/2409.14195
Author:
Luo, Run, Zhang, Haonan, Chen, Longze, Lin, Ting-En, Liu, Xiong, Wu, Yuchuan, Yang, Min, Wang, Minzheng, Zeng, Pengpeng, Gao, Lianli, Shen, Heng Tao, Li, Yunshui, Xia, Xiaobo, Huang, Fei, Song, Jingkuan, Li, Yongbin
The development of Multimodal Large Language Models (MLLMs) has seen significant advancements with increasing demands in various fields (e.g., multimodal agents, embodied intelligence). While model-driven approaches attempt to enhance MLLMs capabilit…
External link:
http://arxiv.org/abs/2409.05840
Author:
Zhang, Lei, Li, Yunshui, Li, Jiaming, Xia, Xiaobo, Yang, Jiaxi, Luo, Run, Wang, Minzheng, Chen, Longze, Liu, Junhao, Yang, Min
Some recently developed code large language models (Code LLMs) have been pre-trained on repository-level code data (Repo-Code LLMs), enabling these models to recognize repository structures and utilize cross-file information for code completion. Howe…
External link:
http://arxiv.org/abs/2406.18294
Author:
Wang, Minzheng, Chen, Longze, Fu, Cheng, Liao, Shengyi, Zhang, Xinghua, Wu, Bingli, Yu, Haiyang, Xu, Nan, Zhang, Lei, Luo, Run, Li, Yunshui, Yang, Min, Huang, Fei, Li, Yongbin
Long-context modeling capabilities have garnered widespread attention, leading to the emergence of Large Language Models (LLMs) with ultra-context windows. Meanwhile, benchmarks for evaluating long-context LLMs are gradually catching up. However, exi…
External link:
http://arxiv.org/abs/2406.17419
Long-context modeling capabilities are important for large language models (LLMs) in various applications. However, directly training LLMs with long context windows is insufficient to enhance this capability since some training samples do not exhibit…
External link:
http://arxiv.org/abs/2405.17915
Author:
Luo, Run, Li, Yunshui, Chen, Longze, He, Wanwei, Lin, Ting-En, Liu, Ziqiang, Zhang, Lei, Song, Zikai, Xia, Xiaobo, Liu, Tongliang, Yang, Min, Hui, Binyuan
The development of large language models (LLMs) has significantly advanced the emergence of large multimodal models (LMMs). While LMMs have achieved tremendous success by promoting the synergy between multimodal comprehension and creation, they often…
External link:
http://arxiv.org/abs/2405.15232
Author:
Zhang, Lei, Li, Yunshui, Liu, Ziqiang, Yang, Jiaxi, Liu, Junhao, Chen, Longze, Luo, Run, Yang, Min
With the advancement of large language models (LLMs) and the expansion of their context windows, existing long-context benchmarks fall short in effectively evaluating the models' comprehension and reasoning abilities in extended texts. Moreover, conv…
External link:
http://arxiv.org/abs/2312.09542