Bidirectional Neural Machine Translation (NMT) using Monolingual Data for Khasi-English Pair

Nongbri Lavinia, Moirangthem Gourashyam, Salam Samarendra, Nongmeikapam Kishorjit


Abstract
Due to the lack of parallel data, machine translation for low-resource languages has been unable to take full advantage of Neural Machine Translation (NMT). This paper investigates several approaches for improving NMT in a strictly low-resource setting, focusing on the bidirectional Khasi-English language pair. Back-translation is used to expand the parallel corpus with monolingual data. The work also experiments with subword tokenizers to improve translation accuracy for new and rare words. The Transformer, a state-of-the-art NMT architecture, serves as the backbone of the bidirectional Khasi-English translation system. The final Khasi-to-English and English-to-Khasi models, trained on both authentic and synthetic parallel corpora, improve on models trained only on the authentic parallel dataset by 2.34 and 3.1 BLEU points, respectively.
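
For readers unfamiliar with the technique, the Python sketch below illustrates how back-translation expands a parallel corpus from monolingual data, as described in the abstract. It is a minimal illustration, not the authors' implementation: the English-to-Khasi reverse model and its translate method are hypothetical stand-ins, SentencePiece is used as one plausible subword tokenizer, and all file names and the vocabulary size are illustrative.

    # A minimal sketch of back-translation data augmentation, assuming a
    # trained English->Khasi model exposed through a hypothetical
    # `translate` method. Not the authors' actual code.
    import sentencepiece as spm

    # Train a subword tokenizer on monolingual Khasi text so that new and
    # rare words decompose into known subword units at translation time.
    spm.SentencePieceTrainer.train(
        input="mono.kha.txt",       # monolingual Khasi sentences, one per line
        model_prefix="kha_subword",
        vocab_size=8000,            # illustrative vocabulary size
        model_type="bpe",
    )
    sp = spm.SentencePieceProcessor(model_file="kha_subword.model")

    def back_translate(mono_english, reverse_model):
        """Build synthetic (Khasi, English) pairs from monolingual English.

        `reverse_model` stands in for any trained English->Khasi NMT model;
        its `translate` method is an assumption, not a specific library API.
        """
        pairs = []
        for en in mono_english:
            kha = reverse_model.translate(en)  # synthetic Khasi source side
            pairs.append((kha, en))            # authentic English target side
        return pairs

    # The synthetic pairs are concatenated with the authentic parallel corpus,
    # subword-tokenized (e.g. sp.encode(sentence, out_type=str)), and used to
    # train the forward Khasi->English Transformer model.
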
Anthology ID: 2023.icon-1.24
Volume: Proceedings of the 20th International Conference on Natural Language Processing (ICON)
Month: December
Year: 2023
Address: Goa University, Goa, India
Editors: D. Pawar Jyoti, Lalitha Devi Sobha
Venue: ICON
SIG: SIGLEX
Publisher: NLP Association of India (NLPAI)
Pages: 318–325
URL: https://aclanthology.org/2023.icon-1.24
Cite (ACL): Nongbri Lavinia, Moirangthem Gourashyam, Salam Samarendra, and Nongmeikapam Kishorjit. 2023. Bidirectional Neural Machine Translation (NMT) using Monolingual Data for Khasi-English Pair. In Proceedings of the 20th International Conference on Natural Language Processing (ICON), pages 318–325, Goa University, Goa, India. NLP Association of India (NLPAI).
Cite (Informal): Bidirectional Neural Machine Translation (NMT) using Monolingual Data for Khasi-English Pair (Lavinia et al., ICON 2023)
PDF: https://aclanthology.org/2023.icon-1.24.pdf