Data Cartography for Low-Resource Neural Machine Translation

Aquia Richburg, Marine Carpuat


Abstract
While collecting or generating more parallel data is necessary to improve machine translation (MT) in low-resource settings, we lack an understanding of how the limited amounts of existing data are actually used to help guide the collection of further resources. In this paper, we apply data cartography techniques (Swayamdipta et al., 2020) to characterize the contribution of training samples in two low-resource MT tasks (Swahili-English and Turkish-English) throughout the training of standard neural MT models. Our empirical study shows that, unlike in prior work for classification tasks, most samples contribute to model training in low-resource MT, albeit not uniformly throughout the training process. Furthermore, uni-dimensional characterizations of samples – e.g., based on dual cross-entropy or word frequency – do not suffice to capture how hard or easy they are to learn. Taken together, our results suggest that data augmentation strategies for low-resource MT would benefit from model-in-the-loop strategies to maximize improvements.
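The data cartography approach referenced in the abstract (Swayamdipta et al., 2020) characterizes each training example by statistics of the model's behavior on it across training epochs, chiefly confidence (mean probability assigned to the gold output) and variability (standard deviation of that probability). The sketch below illustrates those two statistics adapted to sentence-level MT; using length-normalized reference log-probabilities as the per-epoch score is an assumption made here for illustration, not necessarily the exact setup of this paper.

```python
import numpy as np

def cartography_stats(epoch_log_probs):
    """Compute dataset-cartography style statistics per training example.

    epoch_log_probs: array of shape (num_epochs, num_examples), where each entry
    is a per-example score recorded at the end of an epoch -- here assumed to be
    the length-normalized log-probability the MT model assigns to the reference
    translation.

    Returns per-example confidence (mean probability across epochs) and
    variability (standard deviation across epochs), following the definitions
    of Swayamdipta et al. (2020).
    """
    probs = np.exp(np.asarray(epoch_log_probs))  # (num_epochs, num_examples)
    confidence = probs.mean(axis=0)              # high for easy-to-learn samples
    variability = probs.std(axis=0)              # high for ambiguous samples
    return confidence, variability
```

In Swayamdipta et al.'s terminology, examples with high confidence and low variability are "easy to learn", those with low confidence and low variability are "hard to learn", and those with high variability are "ambiguous"; plotting confidence against variability yields the data map used to study how training samples contribute to learning.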
Anthology ID:
2022.findings-emnlp.410
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5594–5607
URL:
https://aclanthology.org/2022.findings-emnlp.410
DOI:
10.18653/v1/2022.findings-emnlp.410
Cite (ACL):
Aquia Richburg and Marine Carpuat. 2022. Data Cartography for Low-Resource Neural Machine Translation. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 5594–5607, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Data Cartography for Low-Resource Neural Machine Translation (Richburg & Carpuat, Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.410.pdf
Video:
https://aclanthology.org/2022.findings-emnlp.410.mp4