%0 Conference Proceedings
%T Improving Robustness of Neural Machine Translation with Multi-task Learning
%A Zhou, Shuyan
%A Zeng, Xiangkai
%A Zhou, Yingqi
%A Anastasopoulos, Antonios
%A Neubig, Graham
%Y Bojar, Ondřej
%Y Chatterjee, Rajen
%Y Federmann, Christian
%Y Fishel, Mark
%Y Graham, Yvette
%Y Haddow, Barry
%Y Huck, Matthias
%Y Yepes, Antonio Jimeno
%Y Koehn, Philipp
%Y Martins, André
%Y Monz, Christof
%Y Negri, Matteo
%Y Névéol, Aurélie
%Y Neves, Mariana
%Y Post, Matt
%Y Turchi, Marco
%Y Verspoor, Karin
%S Proceedings of the Fourth Conference on Machine Translation (Volume 2: Shared Task Papers, Day 1)
%D 2019
%8 August
%I Association for Computational Linguistics
%C Florence, Italy
%F zhou-etal-2019-improving
%X While neural machine translation (NMT) achieves remarkable performance on clean, in-domain text, performance is known to degrade drastically when facing text which is full of typos, grammatical errors and other varieties of noise. In this work, we propose a multi-task learning algorithm for transformer-based MT systems that is more resilient to this noise. We describe our submission to the WMT 2019 Robustness shared task based on this method. Our model achieves a BLEU score of 32.8 on the shared task French to English dataset, which is 7.1 BLEU points higher than the baseline vanilla transformer trained with clean text.
%R 10.18653/v1/W19-5368
%U https://aclanthology.org/W19-5368
%U https://doi.org/10.18653/v1/W19-5368
%P 565-571