Showing 1 - 10 of 321 for search: '"Xie, Zhenda"'
Author:
DeepSeek-AI, Zhu, Qihao, Guo, Daya, Shao, Zhihong, Yang, Dejian, Wang, Peiyi, Xu, Runxin, Wu, Y., Li, Yukun, Gao, Huazuo, Ma, Shirong, Zeng, Wangding, Bi, Xiao, Gu, Zihui, Xu, Hanwei, Dai, Damai, Dong, Kai, Zhang, Liyue, Piao, Yishi, Gou, Zhibin, Xie, Zhenda, Hao, Zhewen, Wang, Bingxuan, Song, Junxiao, Chen, Deli, Xie, Xin, Guan, Kang, You, Yuxiang, Liu, Aixin, Du, Qiushi, Gao, Wenjun, Lu, Xuan, Chen, Qinyu, Wang, Yaohui, Deng, Chengqi, Li, Jiashi, Zhao, Chenggang, Ruan, Chong, Luo, Fuli, Liang, Wenfeng
We present DeepSeek-Coder-V2, an open-source Mixture-of-Experts (MoE) code language model that achieves performance comparable to GPT4-Turbo in code-specific tasks. Specifically, DeepSeek-Coder-V2 is further pre-trained from an intermediate checkpoint …
External link:
http://arxiv.org/abs/2406.11931
Author:
DeepSeek-AI, Liu, Aixin, Feng, Bei, Wang, Bin, Wang, Bingxuan, Liu, Bo, Zhao, Chenggang, Deng, Chengqi, Ruan, Chong, Dai, Damai, Guo, Daya, Yang, Dejian, Chen, Deli, Ji, Dongjie, Li, Erhang, Lin, Fangyun, Luo, Fuli, Hao, Guangbo, Chen, Guanting, Li, Guowei, Zhang, H., Xu, Hanwei, Yang, Hao, Zhang, Haowei, Ding, Honghui, Xin, Huajian, Gao, Huazuo, Li, Hui, Qu, Hui, Cai, J. L., Liang, Jian, Guo, Jianzhong, Ni, Jiaqi, Li, Jiashi, Chen, Jin, Yuan, Jingyang, Qiu, Junjie, Song, Junxiao, Dong, Kai, Gao, Kaige, Guan, Kang, Wang, Lean, Zhang, Lecong, Xu, Lei, Xia, Leyi, Zhao, Liang, Zhang, Liyue, Li, Meng, Wang, Miaojun, Zhang, Mingchuan, Zhang, Minghua, Tang, Minghui, Li, Mingming, Tian, Ning, Huang, Panpan, Wang, Peiyi, Zhang, Peng, Zhu, Qihao, Chen, Qinyu, Du, Qiushi, Chen, R. J., Jin, R. L., Ge, Ruiqi, Pan, Ruizhe, Xu, Runxin, Chen, Ruyi, Li, S. S., Lu, Shanghao, Zhou, Shangyan, Chen, Shanhuang, Wu, Shaoqing, Ye, Shengfeng, Ma, Shirong, Wang, Shiyu, Zhou, Shuang, Yu, Shuiping, Zhou, Shunfeng, Zheng, Size, Wang, T., Pei, Tian, Yuan, Tian, Sun, Tianyu, Xiao, W. L., Zeng, Wangding, An, Wei, Liu, Wen, Liang, Wenfeng, Gao, Wenjun, Zhang, Wentao, Li, X. Q., Jin, Xiangyue, Wang, Xianzu, Bi, Xiao, Liu, Xiaodong, Wang, Xiaohan, Shen, Xiaojin, Chen, Xiaokang, Chen, Xiaosha, Nie, Xiaotao, Sun, Xiaowen, Wang, Xiaoxiang, Liu, Xin, Xie, Xin, Yu, Xingkai, Song, Xinnan, Zhou, Xinyi, Yang, Xinyu, Lu, Xuan, Su, Xuecheng, Wu, Y., Li, Y. K., Wei, Y. X., Zhu, Y. X., Xu, Yanhong, Huang, Yanping, Li, Yao, Zhao, Yao, Sun, Yaofeng, Li, Yaohui, Wang, Yaohui, Zheng, Yi, Zhang, Yichao, Xiong, Yiliang, Zhao, Yilong, He, Ying, Tang, Ying, Piao, Yishi, Dong, Yixin, Tan, Yixuan, Liu, Yiyuan, Wang, Yongji, Guo, Yongqiang, Zhu, Yuchen, Wang, Yuduan, Zou, Yuheng, Zha, Yukun, Ma, Yunxian, Yan, Yuting, You, Yuxiang, Liu, Yuxuan, Ren, Z. Z., Ren, Zehui, Sha, Zhangli, Fu, Zhe, Huang, Zhen, Zhang, Zhen, Xie, Zhenda, Hao, Zhewen, Shao, Zhihong, Wen, Zhiniu, Xu, Zhipeng, Zhang, Zhongyu, Li, Zhuoshu, Wang, Zihan, Gu, Zihui, Li, Zilin, Xie, Ziwei
We present DeepSeek-V2, a strong Mixture-of-Experts (MoE) language model characterized by economical training and efficient inference. It comprises 236B total parameters, of which 21B are activated for each token, and supports a context length of 128K tokens …
External link:
http://arxiv.org/abs/2405.04434
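As a quick illustration of the sparse-activation arithmetic in the DeepSeek-V2 abstract above (236B total parameters, 21B activated per token), the active fraction can be computed directly; the helper below is a sketch with an illustrative name, not code from the paper:

```python
def active_fraction(total_b: float, active_b: float) -> float:
    """Fraction of a sparse MoE model's parameters used per token."""
    return active_b / total_b

# Figures from the DeepSeek-V2 abstract: 236B total, 21B activated
frac = active_fraction(236, 21)
print(f"{frac:.1%} of parameters active per token")  # roughly 8.9%
```

This roughly-9% activation rate is what the abstract means by "economical training and efficient inference": per-token compute scales with the activated parameters, not the total.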
Author:
Zhang, Xiaofan, Zhang, Fan, Jia, Kunpeng, Liu, Yunfeng, Shi, Haosen, Jiang, Yanyi, Jiang, Xiaoshun, Ma, Longsheng, Liang, Wei, Xie, Zhenda, Zhu, Shi-ning
The self-injection locking scheme has the potential to narrow the linewidth of lasers in a compact setup. Here, we report a narrow-linewidth laser source near 1 µm by the self-injection locking scheme using a Fabry-Perot (FP) hollow resonator with a high …
External link:
http://arxiv.org/abs/2403.06163
Author:
Lu, Haoyu, Liu, Wen, Zhang, Bo, Wang, Bingxuan, Dong, Kai, Liu, Bo, Sun, Jingxiang, Ren, Tongzheng, Li, Zhuoshu, Yang, Hao, Sun, Yaofeng, Deng, Chengqi, Xu, Hanwei, Xie, Zhenda, Ruan, Chong
We present DeepSeek-VL, an open-source Vision-Language (VL) Model designed for real-world vision and language understanding applications. Our approach is structured around three key dimensions: We strive to ensure our data is diverse, scalable, and …
External link:
http://arxiv.org/abs/2403.05525
Author:
Guo, Daya, Zhu, Qihao, Yang, Dejian, Xie, Zhenda, Dong, Kai, Zhang, Wentao, Chen, Guanting, Bi, Xiao, Wu, Y., Li, Y. K., Luo, Fuli, Xiong, Yingfei, Liang, Wenfeng
The rapid development of large language models has revolutionized code intelligence in software development. However, the predominance of closed-source models has restricted extensive research and development. To address this, we introduce the DeepSeek-Coder series …
External link:
http://arxiv.org/abs/2401.14196
Author:
Dai, Damai, Deng, Chengqi, Zhao, Chenggang, Xu, R. X., Gao, Huazuo, Chen, Deli, Li, Jiashi, Zeng, Wangding, Yu, Xingkai, Wu, Y., Xie, Zhenda, Li, Y. K., Huang, Panpan, Luo, Fuli, Ruan, Chong, Sui, Zhifang, Liang, Wenfeng
In the era of large language models, Mixture-of-Experts (MoE) is a promising architecture for managing computational costs when scaling up model parameters. However, conventional MoE architectures like GShard, which activate the top-$K$ out of $N$ experts …
External link:
http://arxiv.org/abs/2401.06066
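The conventional top-$K$-of-$N$ expert routing that the DeepSeekMoE abstract above contrasts against (as in GShard) can be sketched as follows; the function name, the toy affinity scores, and the softmax restricted to the selected experts are illustrative assumptions, not code from the paper:

```python
import math

def top_k_gating(affinities, k=2):
    """Conventional top-K MoE routing sketch: pick the k experts with
    the highest affinity scores, then weight them with a softmax
    computed only over the selected scores.
    `affinities` holds one routing score per expert."""
    topk = sorted(range(len(affinities)), key=lambda i: affinities[i],
                  reverse=True)[:k]
    m = max(affinities[i] for i in topk)              # for numerical stability
    exps = [math.exp(affinities[i] - m) for i in topk]
    z = sum(exps)
    return topk, [e / z for e in exps]                # expert ids, gate weights

ids, gates = top_k_gating([0.1, 2.0, -1.0, 1.5], k=2)
print(ids, [round(g, 3) for g in gates])  # two expert ids; gates sum to 1
```

Each token thus runs only k of the N expert networks, which is what keeps per-token compute roughly constant as the expert count N grows.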
Author:
DeepSeek-AI, Bi, Xiao, Chen, Deli, Chen, Guanting, Chen, Shanhuang, Dai, Damai, Deng, Chengqi, Ding, Honghui, Dong, Kai, Du, Qiushi, Fu, Zhe, Gao, Huazuo, Gao, Kaige, Gao, Wenjun, Ge, Ruiqi, Guan, Kang, Guo, Daya, Guo, Jianzhong, Hao, Guangbo, Hao, Zhewen, He, Ying, Hu, Wenjie, Huang, Panpan, Li, Erhang, Li, Guowei, Li, Jiashi, Li, Yao, Li, Y. K., Liang, Wenfeng, Lin, Fangyun, Liu, A. X., Liu, Bo, Liu, Wen, Liu, Xiaodong, Liu, Xin, Liu, Yiyuan, Lu, Haoyu, Lu, Shanghao, Luo, Fuli, Ma, Shirong, Nie, Xiaotao, Pei, Tian, Piao, Yishi, Qiu, Junjie, Qu, Hui, Ren, Tongzheng, Ren, Zehui, Ruan, Chong, Sha, Zhangli, Shao, Zhihong, Song, Junxiao, Su, Xuecheng, Sun, Jingxiang, Sun, Yaofeng, Tang, Minghui, Wang, Bingxuan, Wang, Peiyi, Wang, Shiyu, Wang, Yaohui, Wang, Yongji, Wu, Tong, Wu, Y., Xie, Xin, Xie, Zhenda, Xie, Ziwei, Xiong, Yiliang, Xu, Hanwei, Xu, R. X., Xu, Yanhong, Yang, Dejian, You, Yuxiang, Yu, Shuiping, Yu, Xingkai, Zhang, B., Zhang, Haowei, Zhang, Lecong, Zhang, Liyue, Zhang, Mingchuan, Zhang, Minghua, Zhang, Wentao, Zhang, Yichao, Zhao, Chenggang, Zhao, Yao, Zhou, Shangyan, Zhou, Shunfeng, Zhu, Qihao, Zou, Yuheng
The rapid development of open-source large language models (LLMs) has been truly remarkable. However, the scaling law described in previous literature presents varying conclusions, which casts a dark cloud over scaling LLMs. We delve into the study of …
External link:
http://arxiv.org/abs/2401.02954
Author:
Zhao, Zexing, Wang, Chenyu, Qiu, Jingyuan, Ye, Zhilin, Yin, Zhijun, Jia, Kunpeng, Tian, Xiaohui, Xie, Zhenda, Zhu, Shi-Ning
An optical frequency comb based on a microresonator (microcomb) is an integrated coherent light source with the potential to provide a high-precision frequency standard; a self-referenced, long-term-stable microcomb is the key to this realization.
External link:
http://arxiv.org/abs/2311.14568
We present DreamCraft3D, a hierarchical 3D content generation method that produces high-fidelity and coherent 3D objects. We tackle the problem by leveraging a 2D reference image to guide the stages of geometry sculpting and texture boosting. A central …
External link:
http://arxiv.org/abs/2310.16818
Author:
Cheng, Xiang, Chang, Kai-Chi, Xie, Zhenda, Sarihan, Murat Can, Lee, Yoo Seung, Li, Yongnan, Xu, XinAn, Vinod, Abhinav Kumar, Kocaman, Serdar, Yu, Mingbin, Lo, Patrick Guo-Qiang, Kwong, Dim-Lee, Shapiro, Jeffrey H., Wong, Franco N. C., Wong, Chee Wei
Published in:
Nat. Photon. 17, 656-665 (2023)
Recent progress in quantum computing and networking enables high-performance large-scale quantum processors by connecting different quantum modules. Optical quantum systems show advantages in both computing and communications, and integrated quantum …
External link:
http://arxiv.org/abs/2305.09812