Showing 1 - 4 of 4 for search: '"de Souza, Leandro Rodrigues"'
Multilingual pretraining has been a successful solution to the challenges posed by the lack of resources for languages. These models can transfer knowledge to target languages with minimal or no examples. Recent research suggests that monolingual mod…
External link:
http://arxiv.org/abs/2404.08191
The zero-shot cross-lingual ability of models pretrained on multilingual and even monolingual corpora has spurred many hypotheses to explain this intriguing empirical result. However, due to the costs of pretraining, most research uses public models…
External link:
http://arxiv.org/abs/2209.11035
Pretrained multilingual models have become a de facto default approach for zero-shot cross-lingual transfer. Previous work has shown that these models are able to achieve cross-lingual representations when pretrained on two or more languages with sha…
External link:
http://arxiv.org/abs/2109.01942
Author:
Rosa, Guilherme Moraes, Bonifacio, Luiz Henrique, de Souza, Leandro Rodrigues, Lotufo, Roberto, Nogueira, Rodrigo
An effective method for cross-lingual transfer is to fine-tune a bilingual or multilingual model on a supervised dataset in one language and evaluate it on another language in a zero-shot manner. Translating examples at training time or inference t…
External link:
http://arxiv.org/abs/2105.06813