Showing 1 - 10
of 5,469
for search: '"Sun, Feng"'
We propose a quantum Rabi square model where both nearest-neighbor and next-nearest-neighbor photon hopping are allowed among four quantum Rabi systems located at the vertices of a square. By tuning the next-nearest hopping strength, we …
External link:
http://arxiv.org/abs/2407.03612
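The setup described in this abstract can be summarized by a generic lattice Hamiltonian of this type (a hedged sketch using conventional symbols — ω cavity frequency, Ω qubit splitting, g Rabi coupling, J₁/J₂ nearest- and next-nearest-neighbor hopping — not necessarily the paper's exact notation):

```latex
H=\sum_{i=1}^{4}\Big[\omega\, a_i^{\dagger} a_i+\frac{\Omega}{2}\,\sigma_i^{z}
   +g\,\sigma_i^{x}\big(a_i+a_i^{\dagger}\big)\Big]
  -J_1\sum_{\langle i,j\rangle}\big(a_i^{\dagger} a_j+\mathrm{h.c.}\big)
  -J_2\sum_{\langle\langle i,j\rangle\rangle}\big(a_i^{\dagger} a_j+\mathrm{h.c.}\big)
```

Here ⟨i,j⟩ runs over the four edges of the square and ⟨⟨i,j⟩⟩ over its two diagonals.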
The exotic phase transitions and multistabilities in atom-cavity coupled systems have attracted tremendous interest recently. In this work, we investigate the effect of photon hopping between two Dicke cavities, which induces rich quantum phases …
External link:
http://arxiv.org/abs/2405.19633
Author:
Liu, Yuxuan, Yang, Tianchi, Zhang, Zihan, Song, Minghui, Huang, Haizhen, Deng, Weiwei, Sun, Feng, Zhang, Qi
Generative retrieval, a promising new paradigm in information retrieval, employs a seq2seq model to encode document features into parameters and decode relevant document identifiers (IDs) based on search queries. Existing generative retrieval solutions …
External link:
http://arxiv.org/abs/2405.14280
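The core decoding idea this abstract describes — generating a document ID token by token, constrained so only IDs that exist in the corpus can be emitted — can be sketched as follows. This is a toy illustration with a hypothetical lexical-overlap scorer standing in for a trained seq2seq decoder; the trie constraint is the standard trick, but none of the names below come from the paper.

```python
# Toy sketch of constrained generative retrieval: document IDs are token
# sequences stored in a prefix trie, and greedy decoding may only follow
# trie branches, so the output is always a valid corpus docid.

def build_trie(doc_ids):
    """Prefix trie over docid token sequences; the None key marks a complete ID."""
    root = {}
    for tokens in doc_ids:
        node = root
        for tok in tokens:
            node = node.setdefault(tok, {})
        node[None] = True
    return root

def toy_score(query, token):
    """Stand-in for the decoder's next-token score: crude character overlap."""
    return sum(ch in query for ch in token)

def constrained_greedy_decode(query, trie):
    """Greedily pick the best-scoring token among trie-valid continuations."""
    node, out = trie, []
    while True:
        choices = [tok for tok in node if tok is not None]
        if not choices:
            break
        best = max(choices, key=lambda tok: toy_score(query, tok))
        out.append(best)
        node = node[best]
        if None in node and len(node) == 1:  # only the terminator remains
            break
    return out

doc_ids = [("sports", "news", "03"), ("sports", "scores", "11"), ("tech", "ai", "42")]
trie = build_trie(doc_ids)
print(constrained_greedy_decode("latest ai in tech", trie))  # ['tech', 'ai', '42']
```

Even with a real model, the trie constraint is what guarantees the decoder never hallucinates a nonexistent identifier.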
Author:
Jiang, Ting, Huang, Shaohan, Luo, Shengyue, Zhang, Zihan, Huang, Haizhen, Wei, Furu, Deng, Weiwei, Sun, Feng, Zhang, Qi, Wang, Deqing, Zhuang, Fuzhen
Low-rank adaptation is a popular parameter-efficient fine-tuning method for large language models. In this paper, we analyze the impact of low-rank updating, as implemented in LoRA. Our findings suggest that the low-rank updating mechanism may limit …
External link:
http://arxiv.org/abs/2405.12130
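The "low-rank updating" this abstract analyzes can be illustrated with a minimal NumPy sketch (not the paper's code): the frozen weight W is adapted as W + B @ A with B ∈ ℝ^(d_out×r), A ∈ ℝ^(r×d_in) and r ≪ min(d_out, d_in), so the learned update can never exceed rank r — the structural limit the authors study.

```python
# Minimal NumPy sketch of LoRA's low-rank update; dimensions are illustrative.
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r = 64, 64, 4

W = rng.standard_normal((d_out, d_in))   # frozen pretrained weight
A = rng.standard_normal((r, d_in))       # trainable down-projection
B = rng.standard_normal((d_out, r))      # trainable up-projection
delta = B @ A                            # the update itself: rank <= r
W_adapted = W + delta

def forward(x):
    # Equivalent to (W + B @ A) @ x, without materializing the full delta.
    return W @ x + B @ (A @ x)

x = rng.standard_normal(d_in)
assert np.allclose(forward(x), W_adapted @ x)
print(np.linalg.matrix_rank(delta))  # 4 — the update cannot exceed rank r
```

Whatever values training puts into A and B, rank(B @ A) ≤ r; that cap is exactly what makes LoRA cheap, and what this paper suggests may also be limiting.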
Author:
Shi, Shuhua, Huang, Shaohan, Song, Minghui, Li, Zhoujun, Zhang, Zihan, Huang, Haizhen, Wei, Furu, Deng, Weiwei, Sun, Feng, Zhang, Qi
As one of the most popular parameter-efficient fine-tuning (PEFT) methods, low-rank adaptation (LoRA) is commonly applied to fine-tune large language models (LLMs). However, updating the weights of LoRA blocks effectively and expeditiously is challenging …
External link:
http://arxiv.org/abs/2402.18039
Author:
Liu, Yuxuan, Yang, Tianchi, Huang, Shaohan, Zhang, Zihan, Huang, Haizhen, Wei, Furu, Deng, Weiwei, Sun, Feng, Zhang, Qi
Large language models (LLMs) have emerged as a promising alternative to expensive human evaluations. However, the alignment and coverage of LLM-based evaluations are often limited by the scope and potential bias of the evaluation prompts and criteria …
External link:
http://arxiv.org/abs/2402.15754
Author:
Liu, Yuxuan, Yang, Tianchi, Huang, Shaohan, Zhang, Zihan, Huang, Haizhen, Wei, Furu, Deng, Weiwei, Sun, Feng, Zhang, Qi
Diffusion models have demonstrated exceptional capability in generating high-quality images, videos, and audio. Due to their adaptiveness in iterative refinement, they provide strong potential for achieving better non-autoregressive sequence generation …
External link:
http://arxiv.org/abs/2402.14843
Author:
Sun, Feng, Hong, Aijun
The mobility formula based on deformation potential (DP) theory is of great importance in semiconductor physics. However, the related calculations of the DP constant are controversial, and it is necessary to redo in-depth and comprehensive research on …
External link:
http://arxiv.org/abs/2312.06954
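For context, the deformation-potential mobility formula this abstract refers to is commonly written in the Bardeen–Shockley form below (a sketch of the textbook 3D expression; whether the paper uses exactly this variant is not visible in this snippet):

```latex
\mu=\frac{2\sqrt{2\pi}\,e\,\hbar^{4}\,C_{ii}}
         {3\,(m^{*})^{5/2}\,(k_{B}T)^{3/2}\,E_{1}^{2}}
```

where C_ii is the elastic constant along the transport direction, m* the carrier effective mass, and E₁ the DP constant whose calculation the abstract describes as controversial.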
Author:
Wang, Zhaoyang, Huang, Shaohan, Liu, Yuxuan, Wang, Jiahai, Song, Minghui, Zhang, Zihan, Huang, Haizhen, Wei, Furu, Deng, Weiwei, Sun, Feng, Zhang, Qi
Large language models (LLMs) exhibit impressive emergent abilities in natural language processing, but their democratization is hindered by huge computation requirements and their closed-source nature. Recent research on advancing open-source smaller …
External link:
http://arxiv.org/abs/2310.13332
Author:
Yang, Tianchi, Song, Minghui, Zhang, Zihan, Huang, Haizhen, Deng, Weiwei, Sun, Feng, Zhang, Qi
Generative retrieval, a new paradigm for document retrieval, has recently attracted research interest, since it encodes all documents into the model and directly generates the retrieved documents. However, its power is still underutilized …
External link:
http://arxiv.org/abs/2310.12455