Showing 1 - 10 of 810 for search: '"Downey C"'
The "massively-multilingual" training of multilingual models is known to limit their utility in any one language, and they perform particularly poorly on low-resource languages. However, there is evidence that low-resource languages can benefit from
Externí odkaz:
http://arxiv.org/abs/2405.12413
Published in:
EPJ Web of Conferences, Vol. 288, p. 04006 (2023)
Real-time characterization of irradiation facilities improves the utilization of the core capabilities of test nuclear reactors. The ability to observe how the local neutron flux (level and spectrum) changes as control elements and experiments change …
External link:
https://doaj.org/article/ef05d65cdfe44f2797a4950aeb2fdc1f
Published in:
Clinical, Cosmetic and Investigational Dermatology, Vol. 12, pp. 373-381 (2019)
Rodrigo Meza-Romero,1 Cristián Navarrete-Dechent,1,2 Camila Downey1; 1Department of Dermatology, Facultad de Medicina, Pontificia Universidad Católica de Chile, Santiago, Chile; 2Dermatology Service, Memorial Sloan Kettering Cancer Center, New York, …
External link:
https://doaj.org/article/8a958742da2b42548c8f8babba2d6033
Pre-trained multilingual language models underpin a large portion of modern NLP tools outside of English. A strong baseline for specializing these models for specific languages is Language-Adaptive Pre-Training (LAPT). However, retaining a large cros…
External link:
http://arxiv.org/abs/2309.04679
Language models are widely deployed to provide automatic text completion services in user products. However, recent research has revealed that language models (especially large ones) bear considerable risk of memorizing private training data, which i…
External link:
http://arxiv.org/abs/2212.08619
We formulate and test a technique to use Emergent Communication (EC) with a pre-trained multilingual model to improve on modern Unsupervised NMT systems, especially for low-resource languages. It has been argued that the current dominant paradigm in …
External link:
http://arxiv.org/abs/2207.07025
We show that unsupervised sequence-segmentation performance can be transferred to extremely low-resource languages by pre-training a Masked Segmental Language Model (Downey et al., 2021) multilingually. Further, we show that this transfer can be achi…
External link:
http://arxiv.org/abs/2110.08415
Segmentation remains an important preprocessing step both in languages where "words" or other important syntactic/semantic units (like morphemes) are not clearly delineated by white space, and when dealing with continuous speech data, where th…
External link:
http://arxiv.org/abs/2104.07829
Academic article
This result cannot be displayed to unauthenticated users; log in to view it.
Author:
Posthuma, L.M., Downey, C., Visscher, M.J., Ghazali, D.A., Joshi, M., Ashrafian, H., Khan, S., Darzi, A., Goldstone, J., Preckel, B.
Published in:
International Journal of Nursing Studies, Vol. 104 (April 2020)