%0 Conference Proceedings
%T Domain Generalisation of NMT: Fusing Adapters with Leave-One-Domain-Out Training
%A Vu, Thuy-Trang
%A Khadivi, Shahram
%A Phung, Dinh
%A Haffari, Gholamreza
%Y Muresan, Smaranda
%Y Nakov, Preslav
%Y Villavicencio, Aline
%S Findings of the Association for Computational Linguistics: ACL 2022
%D 2022
%8 May
%I Association for Computational Linguistics
%C Dublin, Ireland
%F vu-etal-2022-domain
%X Generalising to unseen domains is under-explored and remains a challenge in neural machine translation. Inspired by recent research in parameter-efficient transfer learning from pretrained models, this paper proposes a fusion-based generalisation method that learns to combine domain-specific parameters. To address the challenge of not knowing the test domain at training time, we propose a leave-one-domain-out training strategy that avoids information leakage. Empirical results on three language pairs show that our proposed fusion method outperforms other baselines by up to +0.8 BLEU on average.
%R 10.18653/v1/2022.findings-acl.49
%U https://aclanthology.org/2022.findings-acl.49
%U https://doi.org/10.18653/v1/2022.findings-acl.49
%P 582-588