%0 Conference Proceedings
%T Improving the Quality Trade-Off for Neural Machine Translation Multi-Domain Adaptation
%A Hasler, Eva
%A Domhan, Tobias
%A Trenous, Jonay
%A Tran, Ke
%A Byrne, Bill
%A Hieber, Felix
%Y Moens, Marie-Francine
%Y Huang, Xuanjing
%Y Specia, Lucia
%Y Yih, Scott Wen-tau
%S Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
%D 2021
%8 November
%I Association for Computational Linguistics
%C Online and Punta Cana, Dominican Republic
%F hasler-etal-2021-improving
%X Building neural machine translation systems to perform well on a specific target domain is a well-studied problem. Optimizing system performance for multiple, diverse target domains however remains a challenge. We study this problem in an adaptation setting where the goal is to preserve the existing system quality while incorporating data for domains that were not the focus of the original translation system. We find that we can improve over the performance trade-off offered by Elastic Weight Consolidation with a relatively simple data mixing strategy. At comparable performance on the new domains, catastrophic forgetting is mitigated significantly on strong WMT baselines. Combining both approaches improves the Pareto frontier on this task.
%R 10.18653/v1/2021.emnlp-main.666
%U https://aclanthology.org/2021.emnlp-main.666
%U https://doi.org/10.18653/v1/2021.emnlp-main.666
%P 8470-8477