Showing 1 - 10 of 3,335 for search: '"CHEN, Hongwei"'
Author:
Lin, Gaoting, Shu, Mingfang, Zhao, Qirong, Li, Gang, Ma, Yinina, Jiao, Jinlong, Li, Yuting, Duan, Guijing, Huang, Qing, Sheng, Jieming, Kolesnikov, Alexander I., Li, Lu, Wu, Liusuo, Chen, Hongwei, Yu, Rong, Wang, Xiaoqun, Liu, Zhengxin, Zhou, Haidong, Ma, Jie
One of the most important issues in modern condensed matter physics is the realization of fractionalized excitations, such as the Majorana excitations in the Kitaev quantum spin liquid. To this aim, the 3d-based Kitaev material Na2Co2TeO6 is a promis
External link:
http://arxiv.org/abs/2409.07959
Author:
Zang, Yubin, Hua, Boyu, Tang, Zhenzhou, Lin, Zhipeng, Zhang, Fangzheng, Li, Simin, Zhang, Zuxing, Chen, Hongwei
To cater to the needs of Beyond-5G communications, large numbers of data-driven, artificial-intelligence-based fiber models have been put forward to utilize artificial intelligence's regression ability to predict pulse evolution in fiber transmission at
External link:
http://arxiv.org/abs/2408.09951
Author:
Zang, Yubin, Hua, Boyu, Lin, Zhipeng, Zhang, Fangzheng, Li, Simin, Zhang, Zuxing, Chen, Hongwei
In this manuscript, a novel principle-driven fiber transmission model for short-distance transmission with parameterized inputs is put forward. By taking into account the previously proposed principle-driven fiber model, the reduced basis ex
External link:
http://arxiv.org/abs/2408.09947
Optical neural networks have attracted much attention in recent years. Like other optically structured neural networks, fiber neural networks, which utilize the mechanism of light transmission to compute, offer great advantages in both computing efficiency and po
External link:
http://arxiv.org/abs/2408.12602
Author:
Yang, Shuo, Shang, Zirui, Wang, Yongqi, Deng, Derong, Chen, Hongwei, Cheng, Qiyuan, Wu, Xinxiao
This paper proposes a novel framework for multi-label image recognition without any training data, called the data-free framework, which uses knowledge from a pre-trained Large Language Model (LLM) to learn prompts that adapt a pre-trained Vision-Language Model (
External link:
http://arxiv.org/abs/2403.01209
Author:
Liang, Ming, Xie, Xiaoheng, Zhang, Gehao, Zheng, Xunjin, Di, Peng, Jiang, Wei, Chen, Hongwei, Wang, Chengpeng, Fan, Gang
The success of language models in code assistance has spurred the proposal of repository-level code completion as a means to enhance prediction accuracy, utilizing the context from the entire codebase. However, this amplified context can inadvertentl
External link:
http://arxiv.org/abs/2402.14323
Author:
Zheng, Linghan, Liu, Hui, Lin, Xiaojun, Dong, Jiayuan, Sheng, Yue, Shi, Gang, Liu, Zhiwei, Chen, Hongwei
In previous studies, code-based models have consistently outperformed text-based models in reasoning-intensive scenarios. When generating our knowledge base for Retrieval-Augmented Generation (RAG), we observed that code-based models also perform exc
External link:
http://arxiv.org/abs/2401.10286
Author:
Plumley, Rajan, Mardanya, Sougata, Peng, Cheng, Nokelainen, Johannes, Assefa, Tadesse, Shen, Lingjia, Burdet, Nicholas, Porter, Zach, Petsch, Alexander, Israelski, Aidan, Chen, Hongwei, Lee, Jun Sik, Morley, Sophie, Roy, Sujoy, Fabbris, Gilberto, Blackburn, Elizabeth, Feiguin, Adrian, Bansil, Arun, Lee, Wei-Sheng, Lindenberg, Aaron, Chowdhury, Sugata, Dunne, Mike, Turner, Joshua J.
Van der Waals (vdW) magnetic materials are composed of layers of atomically thin sheets, making them ideal platforms for studying magnetism at the two-dimensional (2D) limit. These materials are at the center of a host of novel types of experiments,
External link:
http://arxiv.org/abs/2310.07948
Author:
Di, Peng, Li, Jianguo, Yu, Hang, Jiang, Wei, Cai, Wenting, Cao, Yang, Chen, Chaoyu, Chen, Dajun, Chen, Hongwei, Chen, Liang, Fan, Gang, Gong, Jie, Gong, Zi, Hu, Wen, Guo, Tingting, Lei, Zhichao, Li, Ting, Li, Zheng, Liang, Ming, Liao, Cong, Liu, Bingchang, Liu, Jiachen, Liu, Zhiwei, Lu, Shaojun, Shen, Min, Wang, Guangpei, Wang, Huan, Wang, Zhi, Xu, Zhaogui, Yang, Jiawei, Ye, Qing, Zhang, Gehao, Zhang, Yu, Zhao, Zelin, Zheng, Xunjin, Zhou, Hailian, Zhu, Lifu, Zhu, Xianying
Code Large Language Models (Code LLMs) have gained significant attention in the industry due to their wide applications in the full lifecycle of software engineering. However, the effectiveness of existing models in understanding non-English inputs f
External link:
http://arxiv.org/abs/2310.06266