Continual Learning in Multilingual NMT via Language-Specific Embeddings

Alexandre Berard


Abstract
This paper proposes a technique for adding a new source or target language to an existing multilingual NMT model without re-training it on the initial set of languages. It consists of replacing the shared vocabulary with a small language-specific vocabulary and fine-tuning the new embeddings on the new language’s parallel data. Some additional language-specific components may be trained to improve performance (e.g., Transformer layers or adapter modules). Because the parameters of the original model are not modified, its performance on the initial languages does not degrade. We show in two sets of experiments (small-scale on TED Talks and large-scale on ParaCrawl) that this approach performs as well as or better than the more costly alternatives, and that it has excellent zero-shot performance: training on English-centric data is enough to translate between the new language and any of the initial languages.
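As a rough illustration of the approach the abstract describes, the sketch below freezes an existing multilingual model and trains only a new language-specific embedding table. It assumes a generic PyTorch encoder-decoder with an `embed_tokens` attribute; all names here are hypothetical and not taken from the paper's implementation.

```python
# Minimal sketch of the continual-learning recipe, assuming a generic
# PyTorch encoder-decoder NMT model. Attribute and function names are
# hypothetical, not the author's actual code.
import torch
import torch.nn as nn

def add_new_source_language(model: nn.Module, new_vocab_size: int, d_model: int):
    # Freeze every parameter of the original multilingual model, so its
    # performance on the initial languages cannot degrade.
    for param in model.parameters():
        param.requires_grad = False

    # Replace the shared source-side embedding table with a small
    # language-specific one; these new weights are trainable by default.
    model.encoder.embed_tokens = nn.Embedding(new_vocab_size, d_model)

    # Optional language-specific components (e.g., adapter modules) could
    # be attached here and would likewise stay trainable.

    # Fine-tune only the new parameters on the new language's parallel data.
    trainable = [p for p in model.parameters() if p.requires_grad]
    return torch.optim.Adam(trainable, lr=1e-4)
```

Because the optimizer only ever sees the newly added parameters, the original model is untouched and can still serve the initial language pairs unchanged.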
Anthology ID:
2021.wmt-1.62
Volume:
Proceedings of the Sixth Conference on Machine Translation
Month:
November
Year:
2021
Address:
Online
Editors:
Loïc Barrault, Ondřej Bojar, Fethi Bougares, Rajen Chatterjee, Marta R. Costa-jussà, Christian Federmann, Mark Fishel, Alexander Fraser, Markus Freitag, Yvette Graham, Roman Grundkiewicz, Paco Guzmán, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, Tom Kocmi, André Martins, Makoto Morishita, Christof Monz
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
542–565
URL:
https://aclanthology.org/2021.wmt-1.62
Cite (ACL):
Alexandre Berard. 2021. Continual Learning in Multilingual NMT via Language-Specific Embeddings. In Proceedings of the Sixth Conference on Machine Translation, pages 542–565, Online. Association for Computational Linguistics.
Cite (Informal):
Continual Learning in Multilingual NMT via Language-Specific Embeddings (Berard, WMT 2021)
PDF:
https://aclanthology.org/2021.wmt-1.62.pdf
Video:
https://aclanthology.org/2021.wmt-1.62.mp4
Data
ParaCrawl