Addressing the Vulnerability of NMT in Input Perturbations

Weiwen Xu, Ai Ti Aw, Yang Ding, Kui Wu, Shafiq Joty


Abstract
Neural Machine Translation (NMT) has achieved significant breakthroughs in performance but is known to be vulnerable to input perturbations. As real input noise is difficult to predict during training, robustness is a major issue for system deployment. In this paper, we improve the robustness of NMT models by reducing the effect of noisy words through a Context-Enhanced Reconstruction (CER) approach. CER trains the model to resist noise in two steps: (1) a perturbation step that breaks the naturalness of the input sequence with made-up words; (2) a reconstruction step that defends against noise propagation by generating a better and more robust contextual representation. Experimental results on Chinese-English (ZH-EN) and French-English (FR-EN) translation tasks demonstrate robustness improvements on both news and social media text. Further fine-tuning experiments on social media text show that our approach can converge at a higher performance level and provide better adaptation.
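The abstract's perturbation step (1) amounts to replacing a random fraction of input tokens with made-up words before training. As a rough illustration only, here is a minimal Python sketch of such a step; the function name `perturb`, the pool `noise_vocab`, and the replacement probability `p` are hypothetical assumptions, not the actual implementation from the wwxu21/CER-MT repository.

```python
import random

def perturb(tokens, noise_vocab, p=0.1, seed=None):
    # Break the naturalness of the input sequence (step 1 of CER,
    # per the abstract) by swapping a random fraction of tokens
    # for made-up words. All names here are illustrative, not the
    # paper's actual API.
    rng = random.Random(seed)
    return [rng.choice(noise_vocab) if rng.random() < p else tok
            for tok in tokens]

# Usage: perturb a tokenized source sentence before encoding.
# Step (2) would then train the encoder to produce robust
# contextual representations despite the injected noise.
src = "the cat sat on the mat".split()
made_up = ["blork", "snaf", "quuz"]
print(perturb(src, made_up, p=0.3, seed=0))
# Prints the sentence with a random subset of tokens replaced.
```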
Anthology ID:
2021.naacl-industry.11
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Industry Papers
Month:
June
Year:
2021
Address:
Online
Editors:
Young-bum Kim, Yunyao Li, Owen Rambow
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
80–88
URL:
https://aclanthology.org/2021.naacl-industry.11
DOI:
10.18653/v1/2021.naacl-industry.11
Cite (ACL):
Weiwen Xu, Ai Ti Aw, Yang Ding, Kui Wu, and Shafiq Joty. 2021. Addressing the Vulnerability of NMT in Input Perturbations. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Industry Papers, pages 80–88, Online. Association for Computational Linguistics.
Cite (Informal):
Addressing the Vulnerability of NMT in Input Perturbations (Xu et al., NAACL 2021)
PDF:
https://aclanthology.org/2021.naacl-industry.11.pdf
Video:
https://aclanthology.org/2021.naacl-industry.11.mp4
Code:
wwxu21/CER-MT
Data:
MTNT