Showing 1 - 10 of 322 for search: '"HAN WenTao"'
Author:
Xie, Yanzhou, Fu, Haiqiang, Zhu, Jianjun, Wang, Changcheng, Xie, Qinghua, Wan, Jie, Han, Wentao
Published in:
In Remote Sensing of Environment 1 October 2024 312
Author:
Dunuwille, Wangisa, Wilson, William C., Bjeije, Hassan, Issa, Nancy, Han, Wentao, Parsons, Tyler M., Young, Andrew L., Xavier Raj, Infencia, Krishnan, Aishwarya, Gaur, Tarang, Wang, Eunice S., Weng, Andrew P., Stubbs, Matthew C., Celik, Hamza, Cashen, Amanda F., Edwards, John R., Challen, Grant A.
Published in:
In Blood Neoplasia December 2024 1(4)
Published in:
In Mechanical Systems and Signal Processing 1 November 2024 220
Author:
Zhang, Zhengyan, Gu, Yuxian, Han, Xu, Chen, Shengqi, Xiao, Chaojun, Sun, Zhenbo, Yao, Yuan, Qi, Fanchao, Guan, Jian, Ke, Pei, Cai, Yanzheng, Zeng, Guoyang, Tan, Zhixing, Liu, Zhiyuan, Huang, Minlie, Han, Wentao, Liu, Yang, Zhu, Xiaoyan, Sun, Maosong
In recent years, the size of pre-trained language models (PLMs) has grown by leaps and bounds. However, efficiency issues of these large-scale PLMs limit their utilization in real-world scenarios. We present a suite of cost-effective techniques for t
External link:
http://arxiv.org/abs/2106.10715
Author:
Han, Xu, Zhang, Zhengyan, Ding, Ning, Gu, Yuxian, Liu, Xiao, Huo, Yuqi, Qiu, Jiezhong, Yao, Yuan, Zhang, Ao, Zhang, Liang, Han, Wentao, Huang, Minlie, Jin, Qin, Lan, Yanyan, Liu, Yang, Liu, Zhiyuan, Lu, Zhiwu, Qiu, Xipeng, Song, Ruihua, Tang, Jie, Wen, Ji-Rong, Yuan, Jinhui, Zhao, Wayne Xin, Zhu, Jun
Large-scale pre-trained models (PTMs) such as BERT and GPT have recently achieved great success and become a milestone in the field of artificial intelligence (AI). Owing to sophisticated pre-training objectives and huge model parameters, large-scale
External link:
http://arxiv.org/abs/2106.07139
Author:
Zhang, Zhengyan, Han, Xu, Zhou, Hao, Ke, Pei, Gu, Yuxian, Ye, Deming, Qin, Yujia, Su, Yusheng, Ji, Haozhe, Guan, Jian, Qi, Fanchao, Wang, Xiaozhi, Zheng, Yanan, Zeng, Guoyang, Cao, Huanqi, Chen, Shengqi, Li, Daixuan, Sun, Zhenbo, Liu, Zhiyuan, Huang, Minlie, Han, Wentao, Tang, Jie, Li, Juanzi, Zhu, Xiaoyan, Sun, Maosong
Pre-trained Language Models (PLMs) have proven to be beneficial for various downstream NLP tasks. Recently, GPT-3, with 175 billion parameters and 570GB training data, drew a lot of attention due to the capacity of few-shot (even zero-shot) learning.
External link:
http://arxiv.org/abs/2012.00413
Author:
Feng, Guanyu, Ma, Zixuan, Li, Daixuan, Chen, Shengqi, Zhu, Xiaowei, Han, Wentao, Chen, Wenguang
Published in:
SIGMOD Conference 2021: 513-527
Evolving graphs in the real world are large-scale and constantly changing, as hundreds of thousands of updates may come every second. Monotonic algorithms such as Reachability and Shortest Path are widely used in real-time analytics to gain both stat
External link:
http://arxiv.org/abs/2004.00803
Academic article
This result is available only to logged-in users.
Published in:
In ISPRS Journal of Photogrammetry and Remote Sensing August 2023 202:314-333
Published in:
In Microelectronics Journal June 2023 136