Tricks for Training Sparse Translation Models

Dheeru Dua, Shruti Bhosale, Vedanuj Goswami, James Cross, Mike Lewis, Angela Fan


Abstract
Multi-task learning with an unbalanced data distribution skews model learning towards high resource tasks, especially when model capacity is fixed and fully shared across all tasks. Sparse scaling architectures, such as BASELayers, provide flexible mechanisms for different tasks to have a variable number of parameters, which can be useful to counterbalance skewed data distributions. We find that sparse architectures for multilingual machine translation can perform poorly out of the box and propose two straightforward techniques to mitigate this: a temperature heating mechanism and dense pre-training. Overall, these methods improve performance on two multilingual translation benchmarks compared to standard BASELayers and Dense scaling baselines, and, in combination, more than double model convergence speed.
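To make the "temperature heating" idea in the abstract concrete, here is a minimal sketch of temperature-scaled expert routing for a BASELayers-style sparse layer. The function names, the greedy assignment, and the specific temperature values are illustrative assumptions, not the authors' released implementation; a higher temperature flattens the routing distribution so tokens from low-resource tasks spread more evenly across experts.

```python
# Hypothetical sketch of temperature-scaled expert routing (not the paper's code).
import torch
import torch.nn.functional as F

def route_with_temperature(token_states, expert_embeddings, temperature):
    """Assign each token a distribution over experts, softened by `temperature`.

    token_states:      (num_tokens, dim) token representations
    expert_embeddings: (num_experts, dim) learned expert vectors
    temperature:       scalar > 0; higher values flatten the routing distribution
    """
    logits = token_states @ expert_embeddings.t()        # (num_tokens, num_experts)
    probs = F.softmax(logits / temperature, dim=-1)      # temperature-scaled routing
    expert_choice = probs.argmax(dim=-1)                 # greedy expert per token
    return probs, expert_choice

# Toy usage: 8 tokens, model dimension 16, 4 experts.
tokens = torch.randn(8, 16)
experts = torch.randn(4, 16)
probs_heated, _ = route_with_temperature(tokens, experts, temperature=2.0)  # flatter routing
probs_sharp, _ = route_with_temperature(tokens, experts, temperature=0.5)   # more peaked routing
```

The second trick named in the abstract, dense pre-training, amounts to training a standard densely shared model first and only then introducing the sparse expert layers, so the router starts from reasonable representations rather than from scratch.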
Anthology ID:
2022.naacl-main.244
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
3340–3345
URL:
https://aclanthology.org/2022.naacl-main.244
DOI:
10.18653/v1/2022.naacl-main.244
Cite (ACL):
Dheeru Dua, Shruti Bhosale, Vedanuj Goswami, James Cross, Mike Lewis, and Angela Fan. 2022. Tricks for Training Sparse Translation Models. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 3340–3345, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Tricks for Training Sparse Translation Models (Dua et al., NAACL 2022)
PDF:
https://aclanthology.org/2022.naacl-main.244.pdf
Video:
https://aclanthology.org/2022.naacl-main.244.mp4