Showing 1 - 10 of 33 results for the search: '"Gao, Pengzhi"'
Author:
Du, Jiangshu, Wang, Yibo, Zhao, Wenting, Deng, Zhongfen, Liu, Shuaiqi, Lou, Renze, Zou, Henry Peng, Venkit, Pranav Narayanan, Zhang, Nan, Srinath, Mukund, Zhang, Haoran Ranran, Gupta, Vipul, Li, Yinghui, Li, Tao, Wang, Fei, Liu, Qin, Liu, Tianlin, Gao, Pengzhi, Xia, Congying, Xing, Chen, Cheng, Jiayang, Wang, Zhaowei, Su, Ying, Shah, Raj Sanjay, Guo, Ruohao, Gu, Jing, Li, Haoran, Wei, Kangda, Wang, Zihao, Cheng, Lu, Ranathunga, Surangika, Fang, Meng, Fu, Jie, Liu, Fei, Huang, Ruihong, Blanco, Eduardo, Cao, Yixin, Zhang, Rui, Yu, Philip S., Yin, Wenpeng
This work is motivated by two key trends. On one hand, large language models (LLMs) have shown remarkable versatility in various generative tasks such as writing, drawing, and question answering, significantly reducing the time required for many rout…
External link:
http://arxiv.org/abs/2406.16253
The training paradigm for machine translation has gradually shifted, from learning neural machine translation (NMT) models with extensive parallel corpora to instruction finetuning on multilingual large language models (LLMs) with high-quality transl…
External link:
http://arxiv.org/abs/2401.05861
Consistency regularization methods, such as R-Drop (Liang et al., 2021) and CrossConST (Gao et al., 2023), have achieved impressive supervised and zero-shot performance in the neural machine translation (NMT) field. Can we also boost end-to-end (E2E)…
External link:
http://arxiv.org/abs/2308.14482
Multilingual sentence representations are the foundation for similarity-based bitext mining, which is crucial for scaling multilingual neural machine translation (NMT) systems to more languages. In this paper, we introduce MuSR: a one-for-all Multilin…
External link:
http://arxiv.org/abs/2306.06919
The multilingual neural machine translation (NMT) model has a promising capability of zero-shot translation, where it could directly translate between language pairs unseen during training. For good transfer performance from supervised directions to…
External link:
http://arxiv.org/abs/2305.07310
We introduce Bi-SimCut: a simple but effective training strategy to boost neural machine translation (NMT) performance. It consists of two procedures: bidirectional pretraining and unidirectional finetuning. Both procedures utilize SimCut, a simple r…
External link:
http://arxiv.org/abs/2206.02368
Diverse machine translation aims at generating various target language translations for a given source language sentence. Leveraging the linear relationship in the sentence latent space introduced by the mixup training, we propose a novel method, Mix…
External link:
http://arxiv.org/abs/2109.03402
Author:
Liu, Zhengzhong, Ding, Guanxiong, Bukkittu, Avinash, Gupta, Mansi, Gao, Pengzhi, Ahmed, Atif, Zhang, Shikun, Gao, Xin, Singhavi, Swapnil, Li, Linwei, Wei, Wei, Hu, Zecong, Shi, Haoran, Zhang, Haoying, Liang, Xiaodan, Mitamura, Teruko, Xing, Eric P., Hu, Zhiting
Empirical natural language processing (NLP) systems in application domains (e.g., healthcare, finance, education) involve interoperation among multiple components, ranging from data ingestion, human annotation, to text retrieval, analysis, generation…
External link:
http://arxiv.org/abs/2103.01834
Author:
Gao, Pengzhi, Wang, Meng, Chow, Joe H., Ghiocel, Scott G., Fardanesh, Bruce, Stefopoulos, George, Razanousky, Michael P.
This paper presents a new framework of identifying a series of cyber data attacks on power system synchrophasor measurements. We focus on detecting "unobservable" cyber data attacks that cannot be detected by any existing method that purely relies on…
External link:
http://arxiv.org/abs/1607.04776
Academic article
This result cannot be displayed to unauthenticated users.
You must log in to view this result.