Evaluating Pre-training Objectives for Low-Resource Translation into Morphologically Rich Languages

Prajit Dhar, Arianna Bisazza, Gertjan van Noord


Abstract
The scarcity of parallel data is a major limitation for Neural Machine Translation (NMT) systems, in particular for translation into morphologically rich languages (MRLs). An important way to overcome the lack of parallel data is to leverage target monolingual data, which is typically more abundant and easier to collect. We evaluate a number of techniques to achieve this, ranging from back-translation to random token masking, on the challenging task of translating English into four typologically diverse MRLs, under low-resource settings. Additionally, we introduce Inflection Pre-Training (or PT-Inflect), a novel pre-training objective whereby the NMT system is pre-trained on the task of re-inflecting lemmatized target sentences before being trained on standard source-to-target language translation. We find that PT-Inflect surpasses NMT systems trained only on parallel data. While PT-Inflect is outperformed by back-translation overall, combining the two techniques leads to gains in some of the evaluated language pairs.
Anthology ID:
2022.lrec-1.527
Volume:
Proceedings of the Thirteenth Language Resources and Evaluation Conference
Month:
June
Year:
2022
Address:
Marseille, France
Editors:
Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Jan Odijk, Stelios Piperidis
Venue:
LREC
Publisher:
European Language Resources Association
Note:
Pages:
4933–4943
URL:
https://aclanthology.org/2022.lrec-1.527
Cite (ACL):
Prajit Dhar, Arianna Bisazza, and Gertjan van Noord. 2022. Evaluating Pre-training Objectives for Low-Resource Translation into Morphologically Rich Languages. In Proceedings of the Thirteenth Language Resources and Evaluation Conference, pages 4933–4943, Marseille, France. European Language Resources Association.
Cite (Informal):
Evaluating Pre-training Objectives for Low-Resource Translation into Morphologically Rich Languages (Dhar et al., LREC 2022)
PDF:
https://aclanthology.org/2022.lrec-1.527.pdf
Data
WMT 2020