Showing 1 - 10 of 104 for search: '"Wu Yimeng"'
Published in:
E3S Web of Conferences, Vol 536, p 01024 (2024)
To reduce coal dust pollution when a heavy train enters a tunnel and to ensure the safety of railroad operation, this paper uses numerical simulation to study the characteristics of the wind flow field of a heavy train passing th…
External link:
https://doaj.org/article/701706ba8ff148e781af9b5ea40aced9
Author:
Zhang, Zixing, Xu, Weixiang, Dong, Zhongren, Wang, Kanglin, Wu, Yimeng, Peng, Jing, Wang, Runming, Huang, Dong-Yan
Computational paralinguistics (ComParal) aims to develop algorithms and models to automatically detect, analyze, and interpret non-verbal information in speech communication, e.g., emotion, health state, age, and gender. Despite its rapid progress…
External link:
http://arxiv.org/abs/2411.09349
Author:
Peng, Yu, Xin, Jiaxun, Peng, Nanyi, Li, Yanyi, Huang, Jijiao, Zhang, Ruiqiang, Li, Chen, Wu, Yimeng, Gong, Bingzhang, Wang, Ronghui
Published in:
Diversity and Distributions, 2024 Jan 01, 30(1), 119-133.
External link:
https://www.jstor.org/stable/48754151
Author:
Alghamdi, Asaad, Duan, Xinyu, Jiang, Wei, Wang, Zhenhai, Wu, Yimeng, Xia, Qingrong, Wang, Zhefeng, Zheng, Yi, Rezagholizadeh, Mehdi, Huai, Baoxing, Cheng, Peilun, Ghaddar, Abbas
Developing monolingual large pre-trained language models (PLMs) has been shown to be very successful in handling different tasks in Natural Language Processing (NLP). In this work, we present AraMUS, the largest Arabic PLM, with 11B parameters trained on 52…
External link:
http://arxiv.org/abs/2306.06800
Author:
Ghaddar, Abbas, Wu, Yimeng, Bagga, Sunyam, Rashid, Ahmad, Bibi, Khalil, Rezagholizadeh, Mehdi, Xing, Chao, Wang, Yasheng, Xinyu, Duan, Wang, Zhefeng, Huai, Baoxing, Jiang, Xin, Liu, Qun, Langlais, Philippe
There is a growing body of work in recent years on developing pre-trained language models (PLMs) for the Arabic language. This work addresses two major problems in existing Arabic PLMs that constrain progress in the Arabic NLU and NLG fields…
External link:
http://arxiv.org/abs/2205.10687
Published in:
In Fuel, 15 October 2024, Vol. 374
Published in:
In Combustion and Flame, September 2024, Vol. 267
Author:
Ghaddar, Abbas, Wu, Yimeng, Rashid, Ahmad, Bibi, Khalil, Rezagholizadeh, Mehdi, Xing, Chao, Wang, Yasheng, Xinyu, Duan, Wang, Zhefeng, Huai, Baoxing, Jiang, Xin, Liu, Qun, Langlais, Philippe
Language-specific pre-trained models have proven to be more accurate than multilingual ones in a monolingual evaluation setting, and Arabic is no exception. However, we found that previously released Arabic BERT models were significantly under-trained. I…
External link:
http://arxiv.org/abs/2112.04329
Published in:
In Ecological Indicators, April 2024, Vol. 161
Knowledge distillation is a training and compression strategy in which two neural networks, namely a teacher and a student, are coupled together during training. The teacher network is supposed to be a trustworthy predictor, and the stud…
External link:
http://arxiv.org/abs/2012.14022
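The teacher-student coupling described in this abstract is commonly realized as a loss that mixes the usual hard-label objective with a soft-target term matching the teacher's output distribution. Below is a minimal PyTorch sketch of that generic recipe, not the paper's exact formulation; the temperature T and mixing weight alpha are assumed hyperparameters introduced here for illustration.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Generic knowledge-distillation loss (hedged sketch, not the paper's method).

    T     -- temperature softening both distributions (assumed hyperparameter)
    alpha -- weight balancing hard and soft terms (assumed hyperparameter)
    """
    # Standard supervised cross-entropy on the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    # KL divergence between temperature-softened teacher and student outputs;
    # the T**2 factor keeps gradient magnitudes comparable across temperatures.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T ** 2)
    return alpha * hard_loss + (1.0 - alpha) * soft_loss

In use, the teacher is frozen (its logits computed under torch.no_grad()) while only the student's parameters receive gradients, which is what makes the student a compressed stand-in for the trustworthy teacher.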