Showing 1 - 10 of 11,564 for search: '"JIANG, XIN"'
Author:
Chen, Tiandao, Huang, Zhiyuan, Pan, Jinyu, Liu, Donghan, Zhao, Yinuo, He, Wenbin, Huang, Jiapeng, Jiang, Xin, Pang, Meng, Leng, Yuxin, Li, Ruxin
We demonstrate that by using a 1-m-long gas-filled hollow capillary fiber (HCF) with a core diameter of 100 {\mu}m, tunable ultraviolet (UV) dispersive-wave (DW) pulses can be generated in a compact, single-stage set-up driven directly by 40-fs Ti: sap…
External link:
http://arxiv.org/abs/2412.15482
Author:
Chen, Guoxuan, Shi, Han, Li, Jiawei, Gao, Yihang, Ren, Xiaozhe, Chen, Yimeng, Jiang, Xin, Li, Zhenguo, Liu, Weiyang, Huang, Chao
Large Language Models (LLMs) have exhibited exceptional performance across a spectrum of natural language processing tasks. However, their substantial sizes pose considerable challenges, particularly in computational demands and inference speed, due…
External link:
http://arxiv.org/abs/2412.12094
Electroencephalogram (EEG) signals have attracted significant attention from researchers due to their non-invasive nature and high temporal sensitivity in decoding visual stimuli. However, most recent studies have focused solely on the relationship b…
External link:
http://arxiv.org/abs/2412.10489
Large language models (LLMs) have made remarkable advances in recent years, with scaling laws playing a critical role in this rapid progress. In this paper, we empirically investigate how a critical hyper-parameter, i.e., the global batch size, influ…
External link:
http://arxiv.org/abs/2412.01505
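Background note for the snippet above (a general data-parallel training convention, not necessarily the paper's own definition; the symbols $b_{\text{micro}}$, $N_{\text{dp}}$ and $k_{\text{acc}}$ are illustrative): the global batch size is typically the product
$B_{\text{global}} = b_{\text{micro}} \times N_{\text{dp}} \times k_{\text{acc}}$,
i.e. the per-device micro-batch size times the number of data-parallel workers times the number of gradient-accumulation steps.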
We provide a concrete and computable realization of the $ER=EPR$ conjecture, by deriving the Einstein-Rosen bridge from the quantum entanglement in the thermofield double CFT. The Bekenstein-Hawking entropy of the wormhole is explicitly identified as…
External link:
http://arxiv.org/abs/2411.18485
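For orientation on the snippet above (standard textbook expressions, not the paper's specific identification): the thermofield double state of two CFT copies and the Bekenstein-Hawking entropy take the conventional forms
$|\mathrm{TFD}\rangle = \frac{1}{\sqrt{Z}} \sum_n e^{-\beta E_n/2}\, |n\rangle_L \otimes |n\rangle_R$, \qquad $S_{\mathrm{BH}} = \frac{A}{4 G_N}$,
where $A$ is the horizon area and $G_N$ is Newton's constant, in units where $\hbar = c = k_B = 1$.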
Author:
Huang, Minbin, Huang, Runhui, Shi, Han, Chen, Yimeng, Zheng, Chuanyang, Sun, Xiangguo, Jiang, Xin, Li, Zhenguo, Cheng, Hong
The development of Multi-modal Large Language Models (MLLMs) enhances Large Language Models (LLMs) with the ability to perceive data formats beyond text, significantly advancing a range of downstream applications, such as visual question answering an…
External link:
http://arxiv.org/abs/2411.17773
The {\it finiteness} of the mixed-state entanglement entropy $S_{\text{vN}}$ in CFT$_2$ enables us to show that the dynamical equation of the coincidence limit of $\left(\tfrac{1}{2} S_{\text{vN}}^2\right)_{;ij}$ is precisely the three-dimensional Einstein equation. A pro…
External link:
http://arxiv.org/abs/2410.19711
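For context on the snippet above (the familiar vacuum single-interval result, quoted only as background rather than the mixed-state quantity the paper works with): in a CFT$_2$ with central charge $c$, the entanglement entropy of an interval of length $\ell$ diverges with the UV cutoff $\epsilon$ as
$S_{\text{vN}} = \frac{c}{3} \ln \frac{\ell}{\epsilon}$,
which is presumably why the finiteness emphasized in the abstract is a nontrivial starting point.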
Author:
Wang, Zezhong, Zeng, Xingshan, Liu, Weiwen, Li, Liangyou, Wang, Yasheng, Shang, Lifeng, Jiang, Xin, Liu, Qun, Wong, Kam-Fai
Supervised fine-tuning (SFT) is a common method to enhance the tool calling capabilities of Large Language Models (LLMs), with the training data often being synthesized. The current data synthesis process generally involves sampling a set of tools, f…
External link:
http://arxiv.org/abs/2410.18447
Author:
Li, Qintong, Gao, Jiahui, Wang, Sheng, Pi, Renjie, Zhao, Xueliang, Wu, Chuan, Jiang, Xin, Li, Zhenguo, Kong, Lingpeng
Large language models (LLMs) have significantly benefited from training on diverse, high-quality task-specific data, leading to impressive performance across a range of downstream applications. Current methods often rely on human-annotated data or pr…
External link:
http://arxiv.org/abs/2410.16736
Author:
Ye, Jiacheng, Gao, Jiahui, Gong, Shansan, Zheng, Lin, Jiang, Xin, Li, Zhenguo, Kong, Lingpeng
Autoregressive language models, despite their impressive capabilities, struggle with complex reasoning and long-term planning tasks. We introduce discrete diffusion models as a novel solution to these challenges. Through the lens of subgoal imbalance…
External link:
http://arxiv.org/abs/2410.14157