A Study of Residual Adapters for Multi-Domain Neural Machine Translation

Minh Quang Pham, Josep Maria Crego, François Yvon, Jean Senellart


Abstract
Domain adaptation is an old and vexing problem for machine translation systems. The most common and successful approach to supervised adaptation is to fine-tune a baseline system with in-domain parallel data. Standard fine-tuning, however, modifies all the network parameters, which makes this approach computationally costly and prone to overfitting. A recent, lightweight approach instead augments a baseline model with supplementary (small) adapter layers, keeping the rest of the model unchanged. This has the additional merit of leaving the baseline model intact and adaptable to multiple domains. In this paper, we conduct a thorough analysis of the adapter model in the context of a multi-domain machine translation task. We contrast multiple implementations of this idea on two language pairs. Our main conclusions are that residual adapters provide a fast and cheap method for supervised multi-domain adaptation; our two variants prove as effective as the original adapter model, and open perspectives for making adapted models more robust to domain label errors.
Anthology ID:
2020.wmt-1.72
Volume:
Proceedings of the Fifth Conference on Machine Translation
Month:
November
Year:
2020
Address:
Online
Editors:
Loïc Barrault, Ondřej Bojar, Fethi Bougares, Rajen Chatterjee, Marta R. Costa-jussà, Christian Federmann, Mark Fishel, Alexander Fraser, Yvette Graham, Paco Guzman, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, André Martins, Makoto Morishita, Christof Monz, Masaaki Nagata, Toshiaki Nakazawa, Matteo Negri
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
617–628
URL:
https://aclanthology.org/2020.wmt-1.72
Cite (ACL):
Minh Quang Pham, Josep Maria Crego, François Yvon, and Jean Senellart. 2020. A Study of Residual Adapters for Multi-Domain Neural Machine Translation. In Proceedings of the Fifth Conference on Machine Translation, pages 617–628, Online. Association for Computational Linguistics.
Cite (Informal):
A Study of Residual Adapters for Multi-Domain Neural Machine Translation (Pham et al., WMT 2020)
PDF:
https://aclanthology.org/2020.wmt-1.72.pdf
Video:
https://slideslive.com/38939655
Code
qmpham/experiments
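The adapter idea summarized in the abstract can be sketched as follows. This is a minimal NumPy illustration under common assumptions (a bottleneck of down-projection, ReLU, and up-projection wrapped in a residual connection, with a layer norm on the input), not the authors' actual implementation; all function and variable names here are illustrative:

```python
import numpy as np

def layer_norm(x, eps=1e-6):
    # Normalize over the last dimension (hidden features).
    mu = x.mean(-1, keepdims=True)
    sigma = x.std(-1, keepdims=True)
    return (x - mu) / (sigma + eps)

def residual_adapter(h, W_down, b_down, W_up, b_up):
    """One adapter block: down-project, ReLU, up-project, add residual.

    h: (seq_len, d_model) hidden states from a frozen Transformer layer.
    Only the small adapter parameters (W_down, W_up, biases) would be
    trained per domain; the baseline model stays untouched.
    """
    z = layer_norm(h) @ W_down + b_down   # (seq_len, d_bottleneck)
    z = np.maximum(z, 0.0)                # ReLU non-linearity
    return h + z @ W_up + b_up            # residual connection

# Toy example: d_model=8, bottleneck dimension 2.
rng = np.random.default_rng(0)
h = rng.normal(size=(5, 8))
W_down = rng.normal(size=(8, 2)) * 0.1
W_up = np.zeros((2, 8))  # zero init: the adapter starts as a no-op
out = residual_adapter(h, np.asarray(W_down), np.zeros(2), W_up, np.zeros(8))
```

With the up-projection initialized to zero, the adapter initially passes hidden states through unchanged, so adaptation starts from the baseline model's behavior; a separate small set of such weights can be trained for each domain.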