Distributionally Robust Multilingual Machine Translation

Chunting Zhou, Daniel Levy, Xian Li, Marjan Ghazvininejad, Graham Neubig


Abstract
Multilingual neural machine translation (MNMT) learns to translate multiple language pairs with a single model, potentially improving both the accuracy and the memory-efficiency of deployed models. However, the heavy data imbalance between languages hinders the model from performing uniformly across language pairs. In this paper, we propose a new learning objective for MNMT based on distributionally robust optimization, which minimizes the worst-case expected loss over the set of language pairs. We further show how to practically optimize this objective for large translation corpora using an iterated best response scheme, which is both effective and incurs negligible additional computational cost compared to standard empirical risk minimization. We perform extensive experiments on three sets of languages from two datasets and show that our method consistently outperforms strong baseline methods in terms of average and per-language performance under both many-to-one and one-to-many translation settings.
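The core idea in the abstract, minimizing a worst-case weighted loss over language pairs rather than a uniform average, can be illustrated with a small sketch. This is a generic multiplicative-weights-style group-DRO step, not the paper's exact iterated best response scheme (the authors' implementation is in the linked `violet-zct/fairseq-dro-mnmt` repository); the function names and the `step` parameter are illustrative assumptions.

```python
import numpy as np

def worst_case_weights(losses, step=1.0, prev_w=None):
    """One multiplicative-weights update for an adversarial distribution
    over language pairs: groups with higher loss get more weight.
    (A generic group-DRO-style step, not the paper's exact method.)"""
    losses = np.asarray(losses, dtype=float)
    if prev_w is None:
        # start from the uniform distribution over language pairs
        prev_w = np.full(len(losses), 1.0 / len(losses))
    w = prev_w * np.exp(step * losses)
    return w / w.sum()  # renormalize onto the probability simplex

def dro_objective(losses, weights):
    """Weighted expected loss over language pairs; with adversarial
    weights this upper-bounds the uniform average loss."""
    return float(np.dot(weights, losses))

# Example: three language pairs with imbalanced per-pair losses.
per_pair_losses = [2.0, 1.0, 0.5]
w = worst_case_weights(per_pair_losses)
robust_loss = dro_objective(per_pair_losses, w)
```

Because the adversary concentrates mass on the hardest language pairs, minimizing `robust_loss` pushes the model toward more uniform per-language performance, which is the behavior the abstract reports.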
Anthology ID:
2021.emnlp-main.458
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
5664–5674
URL:
https://aclanthology.org/2021.emnlp-main.458
DOI:
10.18653/v1/2021.emnlp-main.458
Cite (ACL):
Chunting Zhou, Daniel Levy, Xian Li, Marjan Ghazvininejad, and Graham Neubig. 2021. Distributionally Robust Multilingual Machine Translation. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 5664–5674, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Distributionally Robust Multilingual Machine Translation (Zhou et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.458.pdf
Video:
https://aclanthology.org/2021.emnlp-main.458.mp4
Code:
violet-zct/fairseq-dro-mnmt