Showing 1 - 10 of 3,870 for the search: '"Lv Xin"'
Author:
Qiu Jun, Wang Haoyun, Lv Xin, Mao Lipeng, Huang Junyan, Hao Tao, Li Junliang, Qi Shuo, Chen Guodong, Jiang Haiping
Published in:
Open Life Sciences, Vol 18, Iss 1, Pp 589-604 (2023)
The aim of this study is to explore a novel classification and investigate the clinical significance of hepatocellular carcinoma (HCC) cells. We analyzed integrated single-cell RNA sequencing and bulk RNA-seq data obtained from HCC samples. Cell traj…
External link:
https://doaj.org/article/191d8984bf7a460a9db00132b6931d0f
Author:
Lv Xin
Published in:
Applied Mathematics and Nonlinear Sciences, Vol 9, Iss 1 (2024)
This paper proposes an adaptive learning model for the interrelationship between civic education and college students' career planning. A competency point tracking model is then constructed based on a neural network algorithm to explore the re…
External link:
https://doaj.org/article/a0e1dc8452ba4fe98acdc9f148f65e6b
Author:
Zhang, Jiajie, Hou, Zhongni, Lv, Xin, Cao, Shulin, Hou, Zhenyu, Niu, Yilin, Hou, Lei, Dong, Yuxiao, Feng, Ling, Li, Juanzi
Though significant advancements have been achieved in developing long-context large language models (LLMs), the compromised quality of LLM-synthesized data for supervised fine-tuning (SFT) often affects the long-context performance of SFT models and…
External link:
http://arxiv.org/abs/2410.21252
Knowledge distillation (KD) aims to transfer knowledge from a large teacher model to a smaller student model. Previous work applying KD in the field of large language models (LLMs) typically focused on the post-training phase, where the student LLM l…
External link:
http://arxiv.org/abs/2410.16215
Author:
Liu, Jiyuan, Liu, Xinwang, Wang, Siqi, Hu, Xingchen, Liao, Qing, Wan, Xinhang, Zhang, Yi, Lv, Xin, He, Kunlun
Vertical federated learning is a natural and elegant approach to integrating multi-view data vertically partitioned across devices (clients) while preserving their privacy. Apart from model training, existing methods require the collaboration of…
External link:
http://arxiv.org/abs/2409.04111
Author:
Zhang, Jiajie, Bai, Yushi, Lv, Xin, Gu, Wanjun, Liu, Danqing, Zou, Minhao, Cao, Shulin, Hou, Lei, Dong, Yuxiao, Feng, Ling, Li, Juanzi
Though current long-context large language models (LLMs) have demonstrated impressive capacities in answering user questions based on extensive text, the lack of citations in their responses makes user verification difficult, leading to concerns abou…
External link:
http://arxiv.org/abs/2409.02897
Published in:
Zeitschrift für Kristallographie - New Crystal Structures, Vol 236, Iss 4, Pp 721-723 (2021)
C₁₇H₁₆CuN₆OS · 0.5H₂O, monoclinic, P2₁/c (no. 14), a = 10.3523(4) Å, b = 18.2609(9) Å, c = 9.9688(4) Å, β = 103.918(4)°, Z = 4, V = 1829.21(13) Å³, Rgt(F) = 0.0409, wRref(F²) = 0.1099, T = 291(2) K.
External link:
https://doaj.org/article/2ae2ce8c420a4f08b4f916c6a2e7ef54
Published in:
Zeitschrift für Kristallographie - New Crystal Structures, Vol 236, Iss 4, Pp 713-715 (2021)
C₃₄H₃₂Cu₂N₁₂O₂S₂, triclinic, P1̄ (no. 2), a = 7.7970(7) Å, b = 9.6110(9) Å, c = 12.7629(12) Å, α = 71.544(8)°, β = 79.322(8)°, γ = 83.734(8)°, Z = 1, V = 890.21(14) Å³, Rgt(F) = 0.0634, wRref(F²) = 0.1827, T = 291(2) K.
External link:
https://doaj.org/article/e804382cfc3a407195d60223b0f6a226
Author:
Bai, Yushi, Zhang, Jiajie, Lv, Xin, Zheng, Linzhi, Zhu, Siqi, Hou, Lei, Dong, Yuxiao, Tang, Jie, Li, Juanzi
Current long-context large language models (LLMs) can process inputs up to 100,000 tokens, yet struggle to generate outputs exceeding even a modest length of 2,000 words. Through controlled experiments, we find that the model's effective generation l…
External link:
http://arxiv.org/abs/2408.07055
Author:
GLM, Team, Zeng, Aohan, Xu, Bin, Wang, Bowen, Zhang, Chenhui, Yin, Da, Zhang, Dan, Rojas, Diego, Feng, Guanyu, Zhao, Hanlin, Lai, Hanyu, Yu, Hao, Wang, Hongning, Sun, Jiadai, Zhang, Jiajie, Cheng, Jiale, Gui, Jiayi, Tang, Jie, Zhang, Jing, Sun, Jingyu, Li, Juanzi, Zhao, Lei, Wu, Lindong, Zhong, Lucen, Liu, Mingdao, Huang, Minlie, Zhang, Peng, Zheng, Qinkai, Lu, Rui, Duan, Shuaiqi, Zhang, Shudan, Cao, Shulin, Yang, Shuxun, Tam, Weng Lam, Zhao, Wenyi, Liu, Xiao, Xia, Xiao, Zhang, Xiaohan, Gu, Xiaotao, Lv, Xin, Liu, Xinghan, Liu, Xinyi, Yang, Xinyue, Song, Xixuan, Zhang, Xunkai, An, Yifan, Xu, Yifan, Niu, Yilin, Yang, Yuantao, Li, Yueyan, Bai, Yushi, Dong, Yuxiao, Qi, Zehan, Wang, Zhaoyu, Yang, Zhen, Du, Zhengxiao, Hou, Zhenyu, Wang, Zihan
We introduce ChatGLM, an evolving family of large language models that we have been developing over time. This report primarily focuses on the GLM-4 language series, which includes GLM-4, GLM-4-Air, and GLM-4-9B. They represent our most capable model…
External link:
http://arxiv.org/abs/2406.12793