Showing 1 - 10 of 7,269
for search: '"Su, Jing"'
Author:
Zhao, Jingjing, Su, Jing, Lv, Xianchi, Cai, Kaiquan, Zhu, Yanbo, Liu, Yuanwei, Al-Dhahir, Naofal
A novel time-efficient framework is proposed for improving the robustness of a broadband multiple-input multiple-output (MIMO) system against unknown interference under rapidly-varying channels. A mean-squared error (MSE) minimization problem is formulated …
External link:
http://arxiv.org/abs/2412.19221
Author:
Chen, Jiawei, Chen, Wentao, Su, Jing, Xu, Jingjing, Lin, Hongyu, Ren, Mengjie, Lu, Yaojie, Han, Xianpei, Sun, Le
Large language models (LLMs) have shown significant multilingual capabilities. However, the mechanisms underlying the development of these capabilities during pre-training are not well understood. In this paper, we use code LLMs as an experimental platform …
External link:
http://arxiv.org/abs/2412.07298
Author:
Bytedance-Seed-Foundation-Code-Team, Cheng, Yao, Chen, Jianfeng, Chen, Jie, Chen, Li, Chen, Liyu, Chen, Wentao, Chen, Zhengyu, Geng, Shijie, Li, Aoyan, Li, Bo, Li, Bowen, Li, Linyi, Liu, Boyi, Liu, Jerry, Liu, Kaibo, Liu, Qi, Liu, Shukai, Liu, Siyao, Liu, Tianyi, Liu, Tingkai, Liu, Yongfei, Long, Rui, Mai, Jing, Ning, Guanghan, Peng, Z. Y., Shen, Kai, Su, Jiahao, Su, Jing, Sun, Tao, Sun, Yifan, Tao, Yunzhe, Wang, Guoyin, Wang, Siwei, Wang, Xuwu, Wang, Yite, Wang, Zihan, Xia, Jinxiang, Xiang, Liang, Xiao, Xia, Xiao, Yongsheng, Xi, Chenguang, Xin, Shulin, Xu, Jingjing, Xu, Shikun, Yang, Hongxia, Yang, Jack, Yang, Yingxiang, Yuan, Jianbo, Zhang, Jun, Zhang, Yufeng, Zhang, Yuyu, Zheng, Shen, Zhu, He, Zhu, Ming
As the capabilities of code large language models (LLMs) continue to expand, their applications across diverse code intelligence domains are rapidly increasing. However, most existing datasets only evaluate limited application domains. To address this …
External link:
http://arxiv.org/abs/2412.00535
Stereo matching for inland waterways is one of the key technologies for the autonomous navigation of Unmanned Surface Vehicles (USVs), which involves dividing the stereo images into reference images and target images for pixel-level matching. However, …
External link:
http://arxiv.org/abs/2410.07915
Author:
Zhang, Kechi, Li, Ge, Dong, Yihong, Xu, Jingjing, Zhang, Jun, Su, Jing, Liu, Yongfei, Jin, Zhi
Code generation models have shown significant potential for programming tasks. However, existing training methods like supervised fine-tuning face key limitations: they do not effectively teach models to prioritize correct over incorrect solutions in …
External link:
http://arxiv.org/abs/2410.05605
Author:
Dong, Yihong, Li, Ge, Tao, Yongding, Jiang, Xue, Zhang, Kechi, Li, Jia, Su, Jing, Zhang, Jun, Xu, Jingjing
Despite the remarkable success achieved by neural networks, particularly those represented by MLP and Transformer, we reveal that they exhibit potential flaws in the modeling and reasoning of periodicity, i.e., they tend to memorize the periodic data …
External link:
http://arxiv.org/abs/2410.02675
Author:
Wang, Xuwu, Cui, Qiwen, Tao, Yunzhe, Wang, Yiran, Chai, Ziwei, Han, Xiaotian, Liu, Boyi, Yuan, Jianbo, Su, Jing, Wang, Guoyin, Liu, Tingkai, Chen, Liyu, Liu, Tianyi, Sun, Tao, Zhang, Yufeng, Zheng, Sirui, You, Quanzeng, Yang, Yang, Yang, Hongxia
Large language models (LLMs) have become increasingly pivotal across various domains, especially in handling complex data types. This includes structured data processing, as exemplified by ChartQA and ChatGPT-Ada, and multimodal unstructured data processing …
External link:
http://arxiv.org/abs/2410.00773
Low-Rank Adaptation (LoRA) has emerged as a popular technique for fine-tuning large language models (LLMs) to various domains due to its modular design and widespread availability on platforms like Huggingface. This modularity has sparked interest in …
External link:
http://arxiv.org/abs/2409.16167
Micro-expressions (MEs) are brief, subtle facial expressions that reveal concealed emotions, offering key behavioral cues for social interaction. Characterized by short duration, low intensity, and spontaneity, MEs have been mostly studied through su…
External link:
http://arxiv.org/abs/2409.00017
Author:
Zhou, Zhifan, de Araujo, Luís E. E., Dimario, Matt, Zhao, Jie, Su, Jing, Wu, Meng-Chang, Anderson, B. E., Jones, Kevin M., Lett, Paul D.
Entangled graph states can be used for quantum sensing and computing applications. Error correction in measurement-based quantum computing schemes will require the construction of cluster states in at least 3 dimensions. Here we generate 1-, 2-, 3-, …
External link:
http://arxiv.org/abs/2408.06317