%0 Conference Proceedings
%T Improving Document-Level Neural Machine Translation with Domain Adaptation
%A Ul Haq, Sami
%A Abdul Rauf, Sadaf
%A Shoukat, Arslan
%A Hira, Noor-e-
%Y Birch, Alexandra
%Y Finch, Andrew
%Y Hayashi, Hiroaki
%Y Heafield, Kenneth
%Y Junczys-Dowmunt, Marcin
%Y Konstas, Ioannis
%Y Li, Xian
%Y Neubig, Graham
%Y Oda, Yusuke
%S Proceedings of the Fourth Workshop on Neural Generation and Translation
%D 2020
%8 July
%I Association for Computational Linguistics
%C Online
%F ul-haq-etal-2020-improving
%X Recent studies have shown that the translation quality of NMT systems can be improved by providing document-level contextual information. In general, sentence-based NMT models are extended to capture contextual information from large-scale document-level corpora, which are difficult to acquire. Domain adaptation, on the other hand, promises to adapt components of already developed systems by exploiting limited in-domain data. This paper presents FJWU’s system submission at WNGT; we specifically participated in the document-level MT task for German-English translation. Our system is based on a context-aware Transformer model developed on top of the original NMT architecture by integrating contextual information using attention networks. Our experimental results show that providing previous sentences as context significantly improves the BLEU score compared to a strong NMT baseline. We also studied the impact of domain adaptation on document-level translation and were able to improve results by adapting the systems according to the testing domain.
%R 10.18653/v1/2020.ngt-1.27
%U https://aclanthology.org/2020.ngt-1.27
%U https://doi.org/10.18653/v1/2020.ngt-1.27
%P 225-231