Published in:
RANLP
This paper describes the training process of the first Czech monolingual language representation models based on the BERT and ALBERT architectures. We pre-train our models on more than 340K sentences, which is 50 times more than multilingual models th