%0 Conference Proceedings
%T Factorized Transformer for Multi-Domain Neural Machine Translation
%A Deng, Yongchao
%A Yu, Hongfei
%A Yu, Heng
%A Duan, Xiangyu
%A Luo, Weihua
%Y Cohn, Trevor
%Y He, Yulan
%Y Liu, Yang
%S Findings of the Association for Computational Linguistics: EMNLP 2020
%D 2020
%8 November
%I Association for Computational Linguistics
%C Online
%F deng-etal-2020-factorized
%X Multi-Domain Neural Machine Translation (NMT) aims at building a single system that performs well on a range of target domains. However, along with the extreme diversity of cross-domain wording and phrasing style, the imperfections of training data distribution and the inherent defects of the current sequential learning process all contribute to making the task of multi-domain NMT very challenging. To mitigate these problems, we propose the Factorized Transformer, which consists of an in-depth factorization of the parameters of an NMT model, namely Transformer in this paper, into two categories: domain-shared ones that encode common cross-domain knowledge and domain-specific ones that are private for each constituent domain. We experiment with various designs of our model and conduct extensive validations on English to French open multi-domain dataset. Our approach achieves state-of-the-art performance and opens up new perspectives for multi-domain and open-domain applications.
%R 10.18653/v1/2020.findings-emnlp.377
%U https://aclanthology.org/2020.findings-emnlp.377
%U https://doi.org/10.18653/v1/2020.findings-emnlp.377
%P 4221-4230