Showing 1 - 10 of 685 results for the search: '"Shen, JianPing"'
Author:
Zhou, Jie, Vincent, Daniel, Acharya, Sudip, Ojo, Solomon, Abrand, Alireza, Liu, Yang, Gong, Jiarui, Liu, Dong, Haessly, Samuel, Shen, Jianping, Xu, Shining, Li, Yiran, Lu, Yi, Stanchu, Hryhorii, Mawst, Luke, Claflin, Bruce, Mohseni, Parsian K., Ma, Zhenqiang, Yu, Shui-Qing
Group IV GeSn double-heterostructure (DHS) lasers offer the unique advantages of a direct bandgap and CMOS compatibility. However, further improvements in laser performance have been bottlenecked by the limited junction properties of GeSn through conventional …
External link:
http://arxiv.org/abs/2409.09752
Data augmentation (DA) aims to generate constrained and diversified data to improve classifiers in Low-Resource Classification (LRC). Previous studies mostly use a fine-tuned Language Model (LM) to strengthen the constraints but ignore the fact that …
External link:
http://arxiv.org/abs/2109.11834
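The abstract above concerns constraining data augmentation with a fine-tuned language model. As a purely illustrative aside, the sketch below shows one common flavour of LM-based augmentation for low-resource classification: masking a token and letting a masked LM propose in-context replacements via the HuggingFace transformers fill-mask pipeline. The checkpoint bert-base-uncased and the helper augment() are assumptions for this example, not the method of the paper.

```python
# Minimal sketch of masked-LM data augmentation for low-resource classification.
# Assumptions: HuggingFace transformers is installed and "bert-base-uncased" is
# used as the masked LM; this is a generic contextual-augmentation illustration,
# not the approach proposed in the paper listed above.
import random

from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

def augment(sentence: str, n_variants: int = 3) -> list[str]:
    """Mask one random token and let the LM propose in-context replacements."""
    tokens = sentence.split()
    if len(tokens) < 2:
        return []
    idx = random.randrange(len(tokens))
    masked = " ".join(
        fill_mask.tokenizer.mask_token if i == idx else tok
        for i, tok in enumerate(tokens)
    )
    predictions = fill_mask(masked, top_k=n_variants)
    return [p["sequence"] for p in predictions]

print(augment("the movie was surprisingly good"))
```

Each returned sequence keeps the original context and swaps a single token, which is why this kind of augmentation tends to stay label-preserving for short classification inputs.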
Published in:
In Studies in Educational Evaluation, June 2024, 81
Published in:
In Experimental Cell Research, 15 May 2024, 438(2)
Author:
Hao, Yu, Zhang, Yue, Li, Bingyan, Chuan, Huiyan, Wang, Zhaomin, Shen, Jianping, Chen, Zhe, Xie, Ping, Liu, Yong
Published in:
In Journal of Environmental Management, May 2024, 358
Author:
Zhou, Qian, Hao, Guoliang, Xie, Wensen, Chen, Bin, Lu, Wuguang, Wang, Gongxin, Zhong, Rongling, Chen, Jiao, Ye, Juan, Shen, Jianping, Cao, Peng
Published in:
In Journal of Biological Chemistry, May 2024, 300(5)
Recently developed large pre-trained language models, e.g., BERT, have achieved remarkable performance in many downstream natural language processing applications. These pre-trained language models often contain hundreds of millions of parameters and …
External link:
http://arxiv.org/abs/2106.08898
It is desirable to include more controllable attributes to enhance the diversity of generated responses in open-domain dialogue systems. However, existing methods can generate responses with only one controllable attribute or lack a flexible way to generate …
External link:
http://arxiv.org/abs/2106.14614
This paper presents the PALI team's winning system for SemEval-2021 Task 2: Multilingual and Cross-lingual Word-in-Context Disambiguation. We fine-tune the XLM-RoBERTa model to solve the task of word-in-context disambiguation, i.e., to determine whether …
External link:
http://arxiv.org/abs/2104.10375
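The abstract above describes fine-tuning XLM-RoBERTa for word-in-context disambiguation. A minimal sketch of that general setup, treating the two contexts as a sentence pair for binary same-sense/different-sense classification with HuggingFace transformers and PyTorch, might look as follows; the checkpoint xlm-roberta-base, the toy sentence pair, and the single optimizer step are illustrative assumptions, not the PALI team's actual system.

```python
# Minimal sketch: word-in-context disambiguation as sentence-pair classification.
# Assumptions: transformers and torch are installed; "xlm-roberta-base" stands in
# for whatever checkpoint the winning system actually used.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "xlm-roberta-base", num_labels=2  # 0 = different sense, 1 = same sense
)

# One hypothetical training pair: the same target word ("bank") in two contexts.
context_a = "She sat on the bank of the river."
context_b = "He deposited the cheque at the bank."
label = torch.tensor([0])  # different senses

inputs = tokenizer(context_a, context_b, return_tensors="pt", truncation=True)
outputs = model(**inputs, labels=label)

# A real fine-tuning loop would iterate this over the full task data.
outputs.loss.backward()
torch.optim.AdamW(model.parameters(), lr=2e-5).step()

with torch.no_grad():
    pred = model(**inputs).logits.argmax(dim=-1).item()
print("predicted same sense:", bool(pred))
```

In practice a competition system would add proper batching, validation, and task-specific tricks on top of this pair-classification wiring.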
Author:
Yang, Haiqin, Shen, Jianping
Emotion dynamics modeling is a significant task in emotion recognition in conversation. It aims to predict conversational emotions when building empathetic dialogue systems. Existing studies mainly develop models based on Recurrent Neural Networks (RNNs) …
External link:
http://arxiv.org/abs/2104.07252