Multi-Domain Neural Machine Translation

Sander Tars, Mark Fishel


Abstract
We present an approach to neural machine translation (NMT) that supports multiple domains in a single model and allows switching between the domains when translating. The core idea is to treat text domains as distinct languages and to use multilingual NMT methods to create multi-domain translation systems; we show that this approach results in significant translation quality gains over fine-tuning. We also explore whether knowledge of pre-specified text domains is necessary; it turns out that it is, but also that when the domain is not known, quite high translation quality can still be reached, in some cases even higher than with known domains.
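The abstract's core idea, treating domains as languages, follows the multilingual NMT practice of marking each source sentence with a pseudo-token identifying its "language" (here, its domain). A minimal preprocessing sketch of that tagging step is shown below; the token format `<2domain>` and the function name are illustrative assumptions, not taken from the paper.

```python
def tag_with_domain(sentences, domain):
    """Prepend a pseudo-token marking the text domain, in the style of
    multilingual NMT target-language tokens. The <2domain> format is an
    assumed convention, not necessarily the one used in the paper."""
    token = f"<2{domain}>"
    return [f"{token} {sentence}" for sentence in sentences]

# Example: tag sentences from two different domains before training
medical = tag_with_domain(["the patient was discharged today"], "medical")
software = tag_with_domain(["click the save button"], "software")
print(medical[0])   # <2medical> the patient was discharged today
print(software[0])  # <2software> click the save button
```

At translation time, switching domains then amounts to choosing which pseudo-token to prepend to the input sentence.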
Anthology ID:
2018.eamt-main.26
Volume:
Proceedings of the 21st Annual Conference of the European Association for Machine Translation
Month:
May
Year:
2018
Address:
Alicante, Spain
Editors:
Juan Antonio Pérez-Ortiz, Felipe Sánchez-Martínez, Miquel Esplà-Gomis, Maja Popović, Celia Rico, André Martins, Joachim Van den Bogaert, Mikel L. Forcada
Venue:
EAMT
Pages:
279–288
URL:
https://aclanthology.org/2018.eamt-main.26
Cite (ACL):
Sander Tars and Mark Fishel. 2018. Multi-Domain Neural Machine Translation. In Proceedings of the 21st Annual Conference of the European Association for Machine Translation, pages 279–288, Alicante, Spain.
Cite (Informal):
Multi-Domain Neural Machine Translation (Tars & Fishel, EAMT 2018)
PDF:
https://aclanthology.org/2018.eamt-main.26.pdf