Hindi to English: Transformer-Based Neural Machine Translation
Author: Gangar, Kavit; Ruparel, Hardik; Lele, Shreyas
Year of publication: 2023
Subject:
Source: Springer International Conference on Communication, Computing and Electronics Systems, 2020, pp. 337-347
Document type: Working Paper
DOI: 10.1007/978-981-33-4909-4_25
Description: Machine Translation (MT) is one of the most prominent tasks in Natural Language Processing (NLP): the automatic conversion of text from one natural language to another while preserving its meaning and fluency. Although research in machine translation has been going on for decades, the more recent integration of deep learning techniques into natural language processing has led to significant improvements in translation quality. In this paper, we develop a Neural Machine Translation (NMT) system by training the Transformer model to translate text from the Indian language Hindi into English. Because Hindi is a low-resource language, neural networks struggle to model it well, which has slowed the development of neural machine translators for it. To address this gap, we implemented back-translation to augment the training data and, for building the vocabulary, experimented with both word-level and subword-level tokenization using Byte Pair Encoding (BPE), training the Transformer in 10 different configurations in total (a brief tokenization sketch follows this record). One of these configurations achieved a state-of-the-art BLEU score of 24.53 on the test set of the IIT Bombay English-Hindi Corpus. Comment: 10 pages, 2 figures
Database: arXiv
External link:
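
The description mentions building the vocabulary with subword-level Byte Pair Encoding (BPE). The paper does not specify which tokenization toolkit it uses, so the following is only a minimal sketch of that step using the SentencePiece library; the file names and vocabulary size are illustrative assumptions, not values from the paper.

```python
# Minimal BPE subword tokenization sketch.
# Assumptions: sentencepiece as the toolkit, hypothetical training files
# train.hi / train.en, and a placeholder vocabulary size of 32,000.
import sentencepiece as spm

# Train a joint BPE model on the Hindi and English training text.
spm.SentencePieceTrainer.train(
    input="train.hi,train.en",   # hypothetical parallel training files
    model_prefix="hien_bpe",     # writes hien_bpe.model / hien_bpe.vocab
    vocab_size=32000,            # assumed subword vocabulary size
    model_type="bpe",
)

# Load the trained model and segment sentences into subword pieces.
sp = spm.SentencePieceProcessor(model_file="hien_bpe.model")
pieces = sp.encode("यह एक उदाहरण वाक्य है।", out_type=str)
print(pieces)          # subword pieces, e.g. ['▁यह', '▁एक', ...]

ids = sp.encode("This is an example sentence.", out_type=int)
print(sp.decode(ids))  # decoding round-trips back to the original text
```

A word-level configuration, by contrast, would build the vocabulary over whole tokens; subword BPE is typically preferred for a low-resource, morphologically rich source language like Hindi because it keeps rare word forms representable without a very large vocabulary.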