Multilingual Neural Machine Translation

Raj Dabre, Chenhui Chu, Anoop Kunchukuttan


Abstract
The advent of neural machine translation (NMT) has opened up exciting research in building multilingual translation systems, i.e., translation models that can handle more than one language pair. Many advances have been made that enable (1) improved translation for low-resource languages via transfer learning from high-resource languages, and (2) compact translation models spanning multiple languages. In this tutorial, we will cover the latest advances in NMT approaches that leverage multilingualism, especially to enhance low-resource translation. In particular, we will focus on the following topics: modeling parameter sharing for multi-way models, massively multilingual models, training protocols, language divergence, transfer learning, zero-shot/zero-resource learning, pivoting, multilingual pre-training, and multi-source translation.
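
For readers new to the area, a minimal illustrative sketch of the parameter-sharing idea the tutorial covers: the target-language-token approach of Johnson et al. (2017), in which a single shared model is trained on the concatenation of all language pairs and an artificial prepended token tells the decoder which language to produce. The helper name and toy corpus below are hypothetical, for illustration only, and not taken from the tutorial itself.

    # Minimal sketch (illustrative, not from the tutorial) of the
    # target-language-token approach to multilingual NMT
    # (Johnson et al., 2017): one shared encoder-decoder is trained
    # on all language pairs at once, with a prepended token telling
    # the model which target language to generate.

    def tag_example(src_sentence: str, tgt_lang: str) -> str:
        """Prepend an artificial target-language token (hypothetical helper)."""
        return f"<2{tgt_lang}> {src_sentence}"

    # Toy corpus mixing several directions. Sharing parameters lets
    # low-resource pairs benefit from high-resource ones (transfer),
    # and directions never seen in training (e.g. fr->de here) may
    # still work: the zero-shot setting the abstract mentions.
    corpus = [
        ("en", "fr", "Hello world", "Bonjour le monde"),
        ("en", "de", "Good morning", "Guten Morgen"),
        ("fr", "en", "Merci beaucoup", "Thank you very much"),
    ]

    training_pairs = [(tag_example(src, tgt_lang), tgt)
                      for src_lang, tgt_lang, src, tgt in corpus]

    for source, target in training_pairs:
        print(source, "->", target)
    # <2fr> Hello world -> Bonjour le monde
    # <2de> Good morning -> Guten Morgen
    # <2en> Merci beaucoup -> Thank you very much

Because all directions share one set of parameters, a single model replaces O(n^2) bilingual systems, which is the compactness argument made in the abstract.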
Anthology ID: 2020.coling-tutorials.3
Volume: Proceedings of the 28th International Conference on Computational Linguistics: Tutorial Abstracts
Month: December
Year: 2020
Address: Barcelona, Spain (Online)
Editors: Lucia Specia, Daniel Beck
Venue: COLING
Publisher: International Committee for Computational Linguistics
Pages: 16–21
URL: https://aclanthology.org/2020.coling-tutorials.3
DOI: 10.18653/v1/2020.coling-tutorials.3
Cite (ACL): Raj Dabre, Chenhui Chu, and Anoop Kunchukuttan. 2020. Multilingual Neural Machine Translation. In Proceedings of the 28th International Conference on Computational Linguistics: Tutorial Abstracts, pages 16–21, Barcelona, Spain (Online). International Committee for Computational Linguistics.
Cite (Informal): Multilingual Neural Machine Translation (Dabre et al., COLING 2020)
PDF: https://aclanthology.org/2020.coling-tutorials.3.pdf