Revisiting Neural Language Modelling with Syllables

Authors: Oncevay, Arturo; Rojas, Kervy Rivas
Publication year: 2020
Subject:
Document type: Working Paper
Description: Language modelling is regularly analysed at the word, subword, or character level, but syllables are seldom used. Syllables provide shorter sequences than characters, they can be extracted with rules, and their segmentation typically requires less specialised effort than identifying morphemes. We reconsider syllables for an open-vocabulary generation task in 20 languages. We use rule-based syllabification methods for five languages and address the rest with a hyphenation tool, whose behaviour as a syllable proxy is validated. With comparable perplexity, we show that syllables outperform characters, annotated morphemes, and unsupervised subwords. Finally, we also study the overlap of syllables with other subword pieces and discuss some limitations and opportunities.
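The description notes that syllables "can be extracted with rules". As an illustration only (not the paper's actual syllabifiers), a minimal vowel-nucleus splitter for a Spanish-like orthography might look like the following hypothetical sketch:

```python
import re

# Vowel inventory assumed for a Spanish-like orthography (illustrative only).
VOWELS = "aeiouáéíóúü"

def naive_syllables(word: str) -> list[str]:
    """Toy rule-based syllabifier: each syllable is a consonant onset plus a
    vowel nucleus; trailing consonants attach to the final syllable.
    Real syllabification rules (codas, diphthongs, hiatus) are not modelled."""
    pieces = re.findall(rf"[^{VOWELS}]*[{VOWELS}]+", word, re.IGNORECASE)
    if not pieces:
        return [word]  # no vowel nucleus found: leave the word unsplit
    # Append any leftover word-final consonants to the last syllable.
    pieces[-1] += word[len("".join(pieces)):]
    return pieces

print(naive_syllables("palabra"))  # → ['pa', 'la', 'bra']
```

Such rules yield shorter sequences than character-level segmentation while requiring no annotated data, which is the trade-off the paper explores.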
Comment: 5 pages (main paper), 4 pages of Appendix
Database: arXiv