Efficiently Upgrading Multilingual Machine Translation Models to Support More Languages

Simeng Sun, Maha Elbayad, Anna Sun, James Cross


Abstract
With multilingual machine translation (MMT) models continuing to grow in size and number of supported languages, it is natural to reuse and upgrade existing models to save computation as data becomes available in more languages. However, adding new languages requires updating the vocabulary, which complicates the reuse of embeddings. Moreover, the question of how to reuse existing models while making architectural changes to provide capacity for both old and new languages has not been closely studied. In this work, we introduce three techniques that help speed up the effective learning of new languages and alleviate catastrophic forgetting despite vocabulary and architecture mismatches. Our results show that by (1) carefully initializing the network, (2) applying learning rate scaling, and (3) performing data up-sampling, it is possible to exceed the performance of a same-sized baseline model with 30% of the computation and recover the performance of a larger model trained from scratch with over 50% reduction in computation. Furthermore, our analysis reveals that the introduced techniques help learn new translation directions more effectively while alleviating catastrophic forgetting. We hope our work will guide research into more efficient approaches to adding languages to existing MMT models and ultimately maximize the reuse of existing models.
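The three ingredients named in the abstract can be illustrated concretely. The PyTorch sketch below is only a minimal illustration under stated assumptions, not the paper's implementation: the new-token initialization (mean of the pretrained embeddings), the learning-rate scaling factor, and the up-sampling temperature are all placeholder choices.

```python
# Minimal sketch of the three techniques from the abstract (illustrative only;
# initialization scheme, scaling factor, and temperature are assumptions).
import torch
import torch.nn as nn


# (1) Careful initialization: grow the embedding table for the expanded
# vocabulary, reusing pretrained rows for tokens shared with the old vocabulary.
def expand_embeddings(old_emb: nn.Embedding, old_vocab: dict, new_vocab: dict) -> nn.Embedding:
    new_emb = nn.Embedding(len(new_vocab), old_emb.embedding_dim)
    with torch.no_grad():
        # Assumption: new tokens start at the mean of the pretrained embeddings.
        mean_vec = old_emb.weight.mean(dim=0)
        new_emb.weight.copy_(mean_vec.unsqueeze(0).expand(len(new_vocab), -1))
        for tok, new_id in new_vocab.items():
            if tok in old_vocab:  # reuse the pretrained row for shared tokens
                new_emb.weight[new_id] = old_emb.weight[old_vocab[tok]]
    return new_emb


# (2) Learning-rate scaling: give reused (pretrained) parameters a smaller step
# size than newly added ones, so old directions are not overwritten as quickly.
def build_optimizer(old_params, new_params, base_lr=5e-4, old_scale=0.1):
    return torch.optim.Adam([
        {"params": old_params, "lr": base_lr * old_scale},  # assumed scaling factor
        {"params": new_params, "lr": base_lr},
    ])


# (3) Data up-sampling: temperature-based sampling over language pairs that
# boosts low-resource (e.g., newly added) directions relative to their size.
def sampling_probs(pair_sizes: dict, temperature: float = 5.0) -> dict:
    weights = {pair: n ** (1.0 / temperature) for pair, n in pair_sizes.items()}
    total = sum(weights.values())
    return {pair: w / total for pair, w in weights.items()}
```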
Anthology ID:
2023.eacl-main.111
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
1513–1527
URL:
https://aclanthology.org/2023.eacl-main.111
DOI:
10.18653/v1/2023.eacl-main.111
Cite (ACL):
Simeng Sun, Maha Elbayad, Anna Sun, and James Cross. 2023. Efficiently Upgrading Multilingual Machine Translation Models to Support More Languages. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 1513–1527, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Efficiently Upgrading Multilingual Machine Translation Models to Support More Languages (Sun et al., EACL 2023)
PDF:
https://aclanthology.org/2023.eacl-main.111.pdf
Video:
https://aclanthology.org/2023.eacl-main.111.mp4