A Preordered RNN Layer Boosts Neural Machine Translation in Low Resource Settings

Mohaddeseh Bastan, Shahram Khadivi


Abstract
Neural Machine Translation (NMT) models are strong enough to convey semantic and syntactic information from the source language to the target language. However, these models suffer from the need for a large amount of data to learn their parameters. As a result, for languages with scarce data, these models are at risk of underperforming. We propose to augment an attention-based neural network with reordering information to alleviate the lack of data. This augmentation improves the translation quality for both English to Persian and Persian to English by up to 6% BLEU absolute over the baseline models.
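The abstract does not spell out the architecture, but the following is a minimal sketch of one way a preordered RNN layer could be wired into an attention-based encoder: a second RNN runs over the source embeddings rearranged into a target-like word order, and its states are merged with the standard encoder states before attention. All names (PreorderedEncoder, preorder_perm, the merge projection) are illustrative assumptions, not the authors' code.

```python
# Sketch only: one plausible way to add a "preordered" RNN layer to an
# attention-based encoder. Module and variable names are assumptions.
import torch
import torch.nn as nn

class PreorderedEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Standard bidirectional RNN over the original source order.
        self.base_rnn = nn.GRU(emb_dim, hid_dim, bidirectional=True, batch_first=True)
        # Extra RNN over the same tokens rearranged into a target-like order.
        self.preorder_rnn = nn.GRU(emb_dim, hid_dim, bidirectional=True, batch_first=True)
        # Project the concatenated states back to the usual encoder width.
        self.merge = nn.Linear(4 * hid_dim, 2 * hid_dim)

    def forward(self, src_ids, preorder_perm):
        # src_ids: (batch, src_len) token ids in the original source order.
        # preorder_perm: (batch, src_len) permutation of source positions
        # giving a target-like order (e.g., from an external preordering
        # model or word alignments) -- assumed to be provided.
        emb = self.embed(src_ids)
        base_states, _ = self.base_rnn(emb)
        reordered_emb = torch.gather(
            emb, 1, preorder_perm.unsqueeze(-1).expand_as(emb))
        pre_states, _ = self.preorder_rnn(reordered_emb)
        # Concatenate the two encodings position-wise so the attention
        # mechanism sees both orderings.
        return self.merge(torch.cat([base_states, pre_states], dim=-1))

# Toy usage
enc = PreorderedEncoder(vocab_size=100)
src = torch.randint(0, 100, (2, 7))
perm = torch.stack([torch.randperm(7) for _ in range(2)])
out = enc(src, perm)  # (2, 7, 512) context states for an attention decoder
```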
Anthology ID:
2022.loresmt-1.12
Volume:
Proceedings of the Fifth Workshop on Technologies for Machine Translation of Low-Resource Languages (LoResMT 2022)
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Editors:
Atul Kr. Ojha, Chao-Hong Liu, Ekaterina Vylomova, Jade Abbott, Jonathan Washington, Nathaniel Oco, Tommi A Pirinen, Valentin Malykh, Varvara Logacheva, Xiaobing Zhao
Venue:
LoResMT
Publisher:
Association for Computational Linguistics
Pages:
93–98
URL:
https://aclanthology.org/2022.loresmt-1.12
Cite (ACL):
Mohaddeseh Bastan and Shahram Khadivi. 2022. A Preordered RNN Layer Boosts Neural Machine Translation in Low Resource Settings. In Proceedings of the Fifth Workshop on Technologies for Machine Translation of Low-Resource Languages (LoResMT 2022), pages 93–98, Gyeongju, Republic of Korea. Association for Computational Linguistics.
Cite (Informal):
A Preordered RNN Layer Boosts Neural Machine Translation in Low Resource Settings (Bastan & Khadivi, LoResMT 2022)
PDF:
https://aclanthology.org/2022.loresmt-1.12.pdf