Using Multiple Subwords to Improve English-Esperanto Automated Literary Translation Quality

Alberto Poncelas, Jan Buts, James Hadley, Andy Way


Abstract
Building Machine Translation (MT) systems for low-resource languages remains challenging. For many language pairs, parallel data are not widely available, and in such cases MT models do not achieve results comparable to those seen with high-resource languages. When data are scarce, it is of paramount importance to make optimal use of the limited material available. To that end, in this paper we propose employing the same parallel sentences multiple times, changing only the way the words are split each time. For this purpose we use several Byte Pair Encoding models, each configured with a different number of merge operations. In our experiments, we use this technique to expand the available data and improve an MT system involving a low-resource language pair, namely English-Esperanto. As an additional contribution, we make available a set of English-Esperanto parallel data in the literary domain.
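To illustrate the core idea, the following is a minimal, self-contained Sennrich-style BPE sketch in Python (stdlib only; this is an illustration of the technique, not the authors' released code, and the merge counts 5/20/50 and the Esperanto toy sentences are hypothetical placeholders, not the paper's settings). The same sentences are segmented under several merge-operation settings and the copies concatenated into an augmented corpus:

```python
from collections import Counter


def _merge(symbols, pair):
    """Replace each adjacent occurrence of `pair` in `symbols` by its concatenation."""
    out, i = [], 0
    while i < len(symbols):
        if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == pair:
            out.append(symbols[i] + symbols[i + 1])
            i += 2
        else:
            out.append(symbols[i])
            i += 1
    return tuple(out)


def learn_bpe(corpus, num_merges):
    """Learn up to `num_merges` BPE merge operations from a list of sentences."""
    vocab = Counter()
    for sentence in corpus:
        for word in sentence.split():
            vocab[tuple(word) + ("</w>",)] += 1  # characters + end-of-word marker
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for word, freq in vocab.items():
            for pair in zip(word, word[1:]):
                pairs[pair] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)  # most frequent adjacent symbol pair
        merges.append(best)
        new_vocab = Counter()
        for word, freq in vocab.items():
            new_vocab[_merge(word, best)] += freq
        vocab = new_vocab
    return merges


def apply_bpe(sentence, merges, sep="@@"):
    """Segment a sentence with a learned merge list; `sep` marks word-internal splits."""
    out = []
    for word in sentence.split():
        symbols = tuple(word) + ("</w>",)
        for pair in merges:
            symbols = _merge(symbols, pair)
        pieces = [p for p in (s.replace("</w>", "") for s in symbols) if p]
        out.extend(p + sep if i < len(pieces) - 1 else p
                   for i, p in enumerate(pieces))
    return " ".join(out)


# Augmentation as described in the abstract: one copy of the (source side of the)
# parallel data per BPE configuration, then concatenate all copies.
corpus = ["la hundo vidas la katon", "la kato vidas la hundon"]  # toy examples
augmented = [apply_bpe(s, learn_bpe(corpus, n))
             for n in (5, 20, 50)  # illustrative merge-operation settings
             for s in corpus]
```

Each merge count yields a different segmentation of the same sentences (fewer merges give finer, more character-like units), so the concatenated copies expose the model to multiple subword views of the identical parallel material.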
Anthology ID:
2020.loresmt-1.14
Volume:
Proceedings of the 3rd Workshop on Technologies for MT of Low Resource Languages
Month:
December
Year:
2020
Address:
Suzhou, China
Editors:
Alina Karakanta, Atul Kr. Ojha, Chao-Hong Liu, Jade Abbott, John Ortega, Jonathan Washington, Nathaniel Oco, Surafel Melaku Lakew, Tommi A Pirinen, Valentin Malykh, Varvara Logacheva, Xiaobing Zhao
Venue:
LoResMT
Publisher:
Association for Computational Linguistics
Pages:
108–117
URL:
https://aclanthology.org/2020.loresmt-1.14
Cite (ACL):
Alberto Poncelas, Jan Buts, James Hadley, and Andy Way. 2020. Using Multiple Subwords to Improve English-Esperanto Automated Literary Translation Quality. In Proceedings of the 3rd Workshop on Technologies for MT of Low Resource Languages, pages 108–117, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Using Multiple Subwords to Improve English-Esperanto Automated Literary Translation Quality (Poncelas et al., LoResMT 2020)
PDF:
https://aclanthology.org/2020.loresmt-1.14.pdf