Showing 1 - 10 of 418
for search: '"REN Liliang"'
Published in:
Shuitu Baochi Xuebao, Vol 38, Iss 2, Pp 165-177 (2024)
[Objective] The study on the spatial-temporal change characteristics of land use/cover is of great significance for the protection and rational development of land resources in the Yellow River Basin, and can provide an important reference for the im…
External link:
https://doaj.org/article/84fc90b482e74d34a100cc4842405e13
Author:
Lin, Xihui, Zhang, Yunan, Ge, Suyu, Ren, Liliang, Patra, Barun, Chaudhary, Vishrav, Peng, Hao, Song, Xia
Sparse attention, which selectively attends to a subset of tokens in the context, was supposed to be efficient. However, its theoretical reduction in FLOPs has rarely translated into wall-clock speed-up over its dense attention counterparts due to the…
External link:
http://arxiv.org/abs/2407.17678
Efficiently modeling sequences with infinite context length has been a long-standing problem. Past works suffer from either the quadratic computation complexity or the limited extrapolation ability on length generalization. In this work, we present S…
External link:
http://arxiv.org/abs/2406.07522
Author:
Abdin, Marah, Aneja, Jyoti, Awadalla, Hany, Awadallah, Ahmed, Awan, Ammar Ahmad, Bach, Nguyen, Bahree, Amit, Bakhtiari, Arash, Bao, Jianmin, Behl, Harkirat, Benhaim, Alon, Bilenko, Misha, Bjorck, Johan, Bubeck, Sébastien, Cai, Martin, Cai, Qin, Chaudhary, Vishrav, Chen, Dong, Chen, Dongdong, Chen, Weizhu, Chen, Yen-Chun, Chen, Yi-Ling, Cheng, Hao, Chopra, Parul, Dai, Xiyang, Dixon, Matthew, Eldan, Ronen, Fragoso, Victor, Gao, Jianfeng, Gao, Mei, Gao, Min, Garg, Amit, Del Giorno, Allie, Goswami, Abhishek, Gunasekar, Suriya, Haider, Emman, Hao, Junheng, Hewett, Russell J., Hu, Wenxiang, Huynh, Jamie, Iter, Dan, Jacobs, Sam Ade, Javaheripi, Mojan, Jin, Xin, Karampatziakis, Nikos, Kauffmann, Piero, Khademi, Mahoud, Kim, Dongwoo, Kim, Young Jin, Kurilenko, Lev, Lee, James R., Lee, Yin Tat, Li, Yuanzhi, Li, Yunsheng, Liang, Chen, Liden, Lars, Lin, Xihui, Lin, Zeqi, Liu, Ce, Liu, Liyuan, Liu, Mengchen, Liu, Weishung, Liu, Xiaodong, Luo, Chong, Madan, Piyush, Mahmoudzadeh, Ali, Majercak, David, Mazzola, Matt, Mendes, Caio César Teodoro, Mitra, Arindam, Modi, Hardik, Nguyen, Anh, Norick, Brandon, Patra, Barun, Perez-Becker, Daniel, Portet, Thomas, Pryzant, Reid, Qin, Heyang, Radmilac, Marko, Ren, Liliang, de Rosa, Gustavo, Rosset, Corby, Roy, Sambudha, Ruwase, Olatunji, Saarikivi, Olli, Saied, Amin, Salim, Adil, Santacroce, Michael, Shah, Shital, Shang, Ning, Sharma, Hiteshi, Shen, Yelong, Shukla, Swadheen, Song, Xia, Tanaka, Masahiro, Tupini, Andrea, Vaddamanu, Praneetha, Wang, Chunyu, Wang, Guanhua, Wang, Lijuan, Wang, Shuohang, Wang, Xin, Wang, Yu, Ward, Rachel, Wen, Wen, Witte, Philipp, Wu, Haiping, Wu, Xiaoxia, Wyatt, Michael, Xiao, Bin, Xu, Can, Xu, Jiahang, Xu, Weijian, Xue, Jilong, Yadav, Sonali, Yang, Fan, Yang, Jianwei, Yang, Yifan, Yang, Ziyi, Yu, Donghan, Yuan, Lu, Zhang, Chenruidong, Zhang, Cyril, Zhang, Jianwen, Zhang, Li Lyna, Zhang, Yi, Zhang, Yue, Zhang, Yunan, Zhou, Xiren
We introduce phi-3-mini, a 3.8 billion parameter language model trained on 3.3 trillion tokens, whose overall performance, as measured by both academic benchmarks and internal testing, rivals that of models such as Mixtral 8x7B and GPT-3.5 (e.g., phi…
External link:
http://arxiv.org/abs/2404.14219
Existing reference-free turn-level evaluation metrics for chatbots inadequately capture the interaction between the user and the system. Consequently, they often correlate poorly with human evaluations. To address this issue, we propose a novel model…
External link:
http://arxiv.org/abs/2306.15245
Recent hybrid models combining Linear State Space Models (SSMs) with self-attention mechanisms have demonstrated impressive results across a range of sequence modeling tasks. However, current approaches apply attention modules statically and uniformly…
External link:
http://arxiv.org/abs/2306.11197
Modern large-scale Pre-trained Language Models (PLMs) have achieved tremendous success on a wide range of downstream tasks. However, most of the LM pre-training objectives only focus on text reconstruction, but have not sought to learn latent-level i…
External link:
http://arxiv.org/abs/2210.12582
Published in:
Water Science and Engineering, Vol 1, Iss 4, Pp 1-13 (2008)
This study simulated and predicted the runoff of the Aksu River Basin, a typical river basin supplied by snowmelt in an arid mountain region, with a limited data set and few hydrological and meteorological stations. Two hydrological models, the snowm…
External link:
https://doaj.org/article/faadc51a261243fead9ae4a1171553a5
Author:
Zhu, Yongwei, Jiang, Shanhu, Ren, Liliang, Guo, Jianying, Zhong, Feng, Du, Shuping, Cui, Hao, He, Miao, Duan, Zheng
Published in:
In Science of the Total Environment 15 November 2024 951
Author:
Zhan, Hao, Yu, Dongxue, Wang, Le, Zhang, Jiang, Xu, Min, Fang, Xiuqin, Xue, Kai, Yan, Yiqi, Ren, Liliang, Wang, Yanfen, Zhu, Qiuan
Published in:
In Journal of Hydrology October 2024 642