Improving Neural Machine Translation by Bidirectional Training

Liang Ding, Di Wu, Dacheng Tao


Abstract
We present a simple and effective pretraining strategy, bidirectional training (BiT), for neural machine translation. Specifically, we bidirectionally update the model parameters at the early stage and then tune the model normally. To achieve bidirectional updating, we simply reconstruct the training samples from “src→tgt” to “src+tgt→tgt+src” without any complicated model modifications. Notably, our approach does not increase parameters or training steps, requiring only the parallel data. Experimental results show that BiT pushes the SOTA neural machine translation performance significantly higher across 15 translation tasks on 8 language pairs (data sizes range from 160K to 38M). Encouragingly, our proposed model can complement existing data manipulation strategies, i.e., back-translation, data distillation, and data diversification. Extensive analyses show that our approach functions as a novel bilingual code-switcher, obtaining better bilingual alignment.
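
The data reconstruction the abstract describes can be illustrated with a minimal sketch. Assuming that “+” in “src+tgt→tgt+src” denotes dataset-level concatenation (training on both translation directions), a hypothetical helper, not the authors' released code, might look like this:

    # Minimal sketch of BiT-style data reconstruction (hypothetical helper,
    # not the authors' implementation). Assumes "src+tgt -> tgt+src" means
    # concatenating the forward corpus with its direction-reversed copy.
    from typing import List, Tuple

    def build_bidirectional_corpus(
        pairs: List[Tuple[str, str]]
    ) -> List[Tuple[str, str]]:
        """Return the union of src->tgt and tgt->src training samples."""
        forward = [(src, tgt) for src, tgt in pairs]
        backward = [(tgt, src) for src, tgt in pairs]
        return forward + backward

    # Usage: pretrain on the bidirectional corpus at the early stage, then
    # continue tuning on the original src->tgt data only.
    corpus = [("Guten Morgen", "Good morning"), ("Danke", "Thank you")]
    bidirectional = build_bidirectional_corpus(corpus)
    # -> forward pairs followed by the reversed pairs, doubling the data
    #    without adding parameters or changing the model architecture.

Since both directions share one model and one vocabulary, this keeps the parameter count and per-step cost unchanged, consistent with the abstract's claim.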
Anthology ID:
2021.emnlp-main.263
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3278–3284
URL:
https://aclanthology.org/2021.emnlp-main.263
DOI:
10.18653/v1/2021.emnlp-main.263
Cite (ACL):
Liang Ding, Di Wu, and Dacheng Tao. 2021. Improving Neural Machine Translation by Bidirectional Training. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 3278–3284, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Improving Neural Machine Translation by Bidirectional Training (Ding et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.263.pdf