m^4 Adapter: Multilingual Multi-Domain Adaptation for Machine Translation with a Meta-Adapter

Wen Lai, Alexandra Chronopoulou, Alexander Fraser


Abstract
Multilingual neural machine translation models (MNMT) yield state-of-the-art performance when evaluated on data from a domain and language pair seen at training time. However, when an MNMT model is used to translate under domain shift or to a new language pair, performance drops dramatically. We consider a very challenging scenario: adapting the MNMT model both to a new domain and to a new language pair at the same time. In this paper, we propose m^4Adapter (Multilingual Multi-Domain Adaptation for Machine Translation with a Meta-Adapter), which combines domain and language knowledge using meta-learning with adapters. We present results showing that our approach is a parameter-efficient solution that effectively adapts a model to both a new language pair and a new domain, while outperforming other adapter methods. An ablation study also shows that our approach more effectively transfers domain knowledge across different languages and language information across different domains.
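To make the "meta-learning with adapters" idea concrete, below is a minimal, hypothetical sketch (not the authors' released code) of meta-training a bottleneck adapter with a Reptile-style outer loop (Nichol et al., 2018), one common meta-learning algorithm; whether the paper uses this exact algorithm is not stated here. The `Adapter` class, the `sample_task_batch` stub, and all hyperparameters are illustrative assumptions; in the paper's setting the adapters would sit inside an MNMT transformer and each task would be a (language pair, domain) combination.

```python
# Hypothetical sketch: Reptile-style meta-learning over adapter parameters.
# NOT the paper's released implementation; names and data are stand-ins.
import copy
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, non-linearity, up-project, residual."""
    def __init__(self, d_model: int = 512, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(d_model, bottleneck)
        self.up = nn.Linear(bottleneck, d_model)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))

def sample_task_batch():
    """Stand-in for sampling data from one (language pair, domain) task."""
    x = torch.randn(8, 512)  # toy hidden states
    y = torch.randn(8, 512)  # toy regression targets
    return x, y

meta_adapter = Adapter()
meta_lr, inner_lr, inner_steps = 0.1, 1e-3, 5

for meta_step in range(100):
    # Inner loop: clone the meta-adapter and adapt it to one sampled task.
    task_adapter = copy.deepcopy(meta_adapter)
    opt = torch.optim.SGD(task_adapter.parameters(), lr=inner_lr)
    for _ in range(inner_steps):
        x, y = sample_task_batch()
        loss = nn.functional.mse_loss(task_adapter(x), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    # Outer (Reptile) update: move meta-parameters toward task-adapted ones.
    with torch.no_grad():
        for p_meta, p_task in zip(meta_adapter.parameters(),
                                  task_adapter.parameters()):
            p_meta += meta_lr * (p_task - p_meta)
```

The outer update interpolates the shared meta-adapter toward weights adapted on each sampled task, which is what lets a single small adapter serve as a fast-adapting initialization for an unseen language pair and domain, while the underlying MNMT model stays frozen.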
Anthology ID: 2022.findings-emnlp.315
Volume: Findings of the Association for Computational Linguistics: EMNLP 2022
Month: December
Year: 2022
Address: Abu Dhabi, United Arab Emirates
Editors: Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 4282–4296
URL: https://aclanthology.org/2022.findings-emnlp.315
DOI: 10.18653/v1/2022.findings-emnlp.315
Cite (ACL):
Wen Lai, Alexandra Chronopoulou, and Alexander Fraser. 2022. m^4 Adapter: Multilingual Multi-Domain Adaptation for Machine Translation with a Meta-Adapter. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 4282–4296, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
m^4 Adapter: Multilingual Multi-Domain Adaptation for Machine Translation with a Meta-Adapter (Lai et al., Findings 2022)
PDF: https://aclanthology.org/2022.findings-emnlp.315.pdf
Video: https://aclanthology.org/2022.findings-emnlp.315.mp4