Showing 1 - 6 of 6
for search: '"Abteen Ebrahimi"'
Author:
Katharina Kann, Abteen Ebrahimi, Manuel Mager, Arturo Oncevay, John E. Ortega, Annette Rios, Angela Fan, Ximena Gutierrez-Vasques, Luis Chiruzzo, Gustavo A. Giménez-Lugo, Ricardo Ramos, Ivan Vladimir Meza Ruiz, Elisabeth Mager, Vishrav Chaudhary, Graham Neubig, Alexis Palmer, Rolando Coto-Solano, Ngoc Thang Vu
Published in:
Frontiers in Artificial Intelligence, Vol 5 (2022)
Little attention has been paid to the development of human language technology for truly low-resource languages—i.e., languages with limited amounts of digitally available text data, such as Indigenous languages. However, it has been shown that pre…
External link:
https://doaj.org/article/3d0de02183b848f3900a562e6f99d495
Published in:
Proceedings of the 4th Workshop on NLP for Conversational AI.
Author:
Abteen Ebrahimi, Katharina Kann
Published in:
ACL/IJCNLP (1)
Pretrained multilingual models (PMMs) enable zero-shot learning via cross-lingual transfer, performing best for languages seen during pretraining. While methods exist to improve performance for unseen languages, they have almost exclusively been evaluated…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::55785e7b574851199a814a36fe84bea3
http://arxiv.org/abs/2106.02124
Author:
Abteen Ebrahimi, Manuel Mager, Arturo Oncevay, Vishrav Chaudhary, Luis Chiruzzo, Angela Fan, John Ortega, Ricardo Ramos, Annette Rios, Ivan Vladimir Meza Ruiz, Gustavo Giménez-Lugo, Elisabeth Mager, Graham Neubig, Alexis Palmer, Rolando Coto-Solano, Thang Vu, Katharina Kann
Pretrained multilingual models are able to perform cross-lingual transfer in a zero-shot setting, even for languages unseen during pretraining. However, prior work evaluating performance on unseen languages has largely been limited to low-level, syntactic…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::e09ad28a5e1b4d5cb2039832f3a27c03
http://arxiv.org/abs/2104.08726
Author:
Elisabeth Mager-Hois, Gustavo A. Giménez-Lugo, Luis Chiruzzo, John Ortega, Alexis Palmer, Manuel Mager, Ngoc Thang Vu, Annette Rios, Arturo Oncevay, Angela Fan, Ximena Gutierrez-Vasques, Ivan Meza, Rolando Coto-Solano, Vishrav Chaudhary, Abteen Ebrahimi, Ricardo Argenton Ramos, Katharina Kann, Graham Neubig
Published in:
Proceedings of the First Workshop on Natural Language Processing for Indigenous Languages of the Americas.
This paper presents the results of the 2021 Shared Task on Open Machine Translation for Indigenous Languages of the Americas. The shared task featured two independent tracks, and participants submitted machine translation systems for up to 10 indigenous languages…
Published in:
ACL (1)
Neural natural language generation (NNLG) from structured meaning representations has become increasingly popular in recent years. While we have seen progress with generating syntactically correct utterances that preserve semantics, various shortcomings…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::777ba336625b9e16de3a431a694b9ef0