Showing 1 - 10 of 8,821 for search: '"Wu, Xing"'
In this paper, we propose a novel method of using the characteristic operator (CO) ${\cal \hat{D}}_{n_{\gamma},n_{\beta}}$ to formalize the principle of maximum conformality (PMC) procedures. Using the CO formalism, we are able to facilitate the derivation…
External link:
http://arxiv.org/abs/2411.15402
In this paper, we conduct a detailed investigation of the rare decay processes of charged mesons, specifically $B^+ \to K^+\ell^+\ell^-$ with $\ell=(e,\mu,\tau)$ and $B^+ \to K^+\nu\bar{\nu}$. These processes involve flavor-changing-neutral-current (FCNC)…
External link:
http://arxiv.org/abs/2411.12141
Author:
Su, Zhenpeng, Wu, Xing, Lin, Zijia, Xiong, Yizhe, Lv, Minxuan, Ma, Guangyuan, Chen, Hui, Hu, Songlin, Ding, Guiguang
Large language models (LLMs) have been attracting much attention from the community recently, due to their remarkable performance in all kinds of downstream tasks. According to the well-known scaling law, scaling up a dense LLM enhances its capabilities…
External link:
http://arxiv.org/abs/2410.16077
Published in:
Eur.Phys.J.C 84 (2024) 11, 1216
In an extension of the MSSM with two triplets and a singlet, called the TNMSSM, there are seven neutralinos, which can enrich the study of cold dark matter if one expects that the weakly interacting massive particle (WIMP) is responsible for the observation…
External link:
http://arxiv.org/abs/2410.13659
In this paper, we explore the properties of the Ellis-Jaffe Sum Rule (EJSR) in detail, employing the Principle of Maximum Conformality (PMC) approach to address its perturbative QCD contribution including next-to-next-to-next-to-leading order ($\rm N^3LO$)…
External link:
http://arxiv.org/abs/2410.06956
Large language models (LLMs) have made significant progress in natural language understanding and generation, driven by scalable pretraining and advanced finetuning. However, enhancing reasoning abilities in LLMs, particularly via reinforcement learning…
External link:
http://arxiv.org/abs/2410.02229
Large Language Model-based Dense Retrieval (LLM-DR) optimizes over numerous heterogeneous fine-tuning collections from different domains. However, the discussion of its training data distribution is still minimal. Previous studies rely on empirical…
External link:
http://arxiv.org/abs/2408.10613
Published in:
Phys. Rev. D 110, 114010 (2024)
In this paper, we compute the total and differential cross sections for $e^+e^- \to J/\psi+c+\bar{c}$ at the $B$ factories up to next-to-leading order (NLO) corrections within the framework of nonrelativistic QCD factorization theory. We then obtain…
External link:
http://arxiv.org/abs/2407.14150
Author:
Su, Zhenpeng, Lin, Zijia, Bai, Xue, Wu, Xing, Xiong, Yizhe, Lian, Haoran, Ma, Guangyuan, Chen, Hui, Ding, Guiguang, Zhou, Wei, Hu, Songlin
Scaling the size of a model enhances its capabilities but significantly increases computation complexity. Mixture-of-Experts (MoE) models address this issue by allowing model size to scale up without substantially increasing training or inference cost…
External link:
http://arxiv.org/abs/2407.09816
Author:
Di Giustino, Leonardo, Brodsky, Stanley J., Ratcliffe, Philip G., Wang, Sheng-Quan, Wu, Xing-Gang
We present a new approach to determining the strong coupling $\alpha_s(Q)$, over the entire range of validity of perturbative QCD, for scales above $\Lambda_{\mathrm{QCD}}$ and up to the Planck scale $\sim1.22\cdot10^{19}$\,GeV, with the highest precision…
External link:
http://arxiv.org/abs/2407.08570