Pre-training Methods for Neural Machine Translation

Mingxuan Wang, Lei Li


Abstract
This tutorial provides a comprehensive guide to making the most of pre-training for neural machine translation (NMT). First, we will briefly introduce the background of NMT and of pre-training methodology, and point out the main challenges of applying pre-training to NMT. We will then focus on analysing the role of pre-training in enhancing NMT performance, how to design better pre-training models for specific NMT tasks, and how to better integrate pre-trained models into NMT systems. In each part, we will provide examples, discuss training techniques, and analyse what is transferred when applying pre-training.
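As a concrete illustration of the last point, integrating a pre-trained model into a translation pipeline, the sketch below uses the Hugging Face transformers library with the publicly released facebook/mbart-large-50-many-to-many-mmt checkpoint. The library, checkpoint, and language codes are illustrative assumptions and are not prescribed by this tutorial abstract.

# A minimal, hypothetical sketch: translating with a pre-trained multilingual
# seq2seq model (mBART-50) via the Hugging Face transformers library.
# These choices are illustrative, not the tutorial's own recipe.
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model_name = "facebook/mbart-large-50-many-to-many-mmt"
tokenizer = MBart50TokenizerFast.from_pretrained(model_name)
model = MBartForConditionalGeneration.from_pretrained(model_name)

# Translate an English sentence into German by conditioning generation
# on the target-language code.
tokenizer.src_lang = "en_XX"
inputs = tokenizer("Pre-training helps low-resource translation.", return_tensors="pt")
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.lang_code_to_id["de_DE"],
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))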
Anthology ID:
2021.acl-tutorials.4
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing: Tutorial Abstracts
Month:
August
Year:
2021
Address:
Online
Editors:
David Chiang, Min Zhang
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
21–25
URL:
https://aclanthology.org/2021.acl-tutorials.4
DOI:
10.18653/v1/2021.acl-tutorials.4
Cite (ACL):
Mingxuan Wang and Lei Li. 2021. Pre-training Methods for Neural Machine Translation. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing: Tutorial Abstracts, pages 21–25, Online. Association for Computational Linguistics.
Cite (Informal):
Pre-training Methods for Neural Machine Translation (Wang & Li, ACL-IJCNLP 2021)
PDF:
https://aclanthology.org/2021.acl-tutorials.4.pdf