AdvAug: Robust Adversarial Augmentation for Neural Machine Translation

Yong Cheng, Lu Jiang, Wolfgang Macherey, Jacob Eisenstein


Abstract
In this paper, we propose a new adversarial augmentation method for Neural Machine Translation (NMT). The main idea is to minimize the vicinal risk over virtual sentences sampled from two vicinity distributions, of which the crucial one is a novel vicinity distribution for adversarial sentences that describes a smooth interpolated embedding space centered around observed training sentence pairs. We then discuss our approach, AdvAug, to train NMT models using the embeddings of virtual sentences in sequence-to-sequence learning. Experiments on Chinese-English, English-French, and English-German translation benchmarks show that AdvAug achieves significant improvements over the Transformer (up to 4.9 BLEU points), and substantially outperforms other data augmentation techniques (e.g., back-translation) without using extra corpora.
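The abstract's "smooth interpolated embedding space centered around observed training sentence pairs" is a mixup-style idea: virtual examples are formed as convex combinations of the embeddings of real examples. The sketch below illustrates only that interpolation step in plain NumPy; the function name, shapes, and Beta-distribution mixing weight are illustrative assumptions, not the paper's actual implementation (which interpolates adversarial and observed examples inside a sequence-to-sequence model).

```python
import numpy as np

def interpolate_embeddings(emb_a, emb_b, alpha=0.2, rng=None):
    """Form a virtual example as a convex combination of two embedding
    sequences, with mixing weight lambda ~ Beta(alpha, alpha) (mixup-style).
    Shapes: both inputs are (seq_len, d_model); hypothetical sketch only."""
    rng = np.random.default_rng() if rng is None else rng
    lam = float(rng.beta(alpha, alpha))
    virtual = lam * emb_a + (1.0 - lam) * emb_b
    return virtual, lam

# Toy "sentence" embedding sequences of equal length.
rng = np.random.default_rng(0)
emb_a = rng.normal(size=(5, 8))
emb_b = rng.normal(size=(5, 8))
virtual, lam = interpolate_embeddings(emb_a, emb_b, rng=rng)
```

In training, such virtual embeddings (and correspondingly mixed target losses) would be fed to the model in place of, or alongside, the original pairs, which is how vicinal risk minimization augments the data without any extra corpora.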
Anthology ID:
2020.acl-main.529
Original:
2020.acl-main.529v1
Version 2:
2020.acl-main.529v2
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
5961–5970
URL:
https://aclanthology.org/2020.acl-main.529
DOI:
10.18653/v1/2020.acl-main.529
Cite (ACL):
Yong Cheng, Lu Jiang, Wolfgang Macherey, and Jacob Eisenstein. 2020. AdvAug: Robust Adversarial Augmentation for Neural Machine Translation. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 5961–5970, Online. Association for Computational Linguistics.
Cite (Informal):
AdvAug: Robust Adversarial Augmentation for Neural Machine Translation (Cheng et al., ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.529.pdf
Video:
http://slideslive.com/38929269
Data
WMT 2014