GradSim: Gradient-Based Language Grouping for Effective Multilingual Training

Mingyang Wang, Heike Adel, Lukas Lange, Jannik Strötgen, Hinrich Schuetze


Abstract
Most of the world's languages pose low-resource challenges to natural language processing models. Multilingual training allows knowledge to be shared among languages. However, not all languages positively influence each other, and it remains an open research question how to select the most suitable set of languages for multilingual training while avoiding negative interference among languages whose characteristics or data distributions are not compatible. In this paper, we propose GradSim, a language grouping method based on gradient similarity. Our experiments on three diverse multilingual benchmark datasets show that GradSim yields the largest performance gains and correlates better with cross-lingual model performance than other similarity measures. As a result, we set a new state of the art on AfriSenti, a benchmark dataset for sentiment analysis in low-resource African languages. Our extensive analysis further reveals that, besides linguistic features, the topics of the datasets play an important role in language grouping, and that lower layers of transformer models encode language-specific features while higher layers capture task-specific information.
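
To illustrate the core idea of grouping languages by gradient similarity, the following is a minimal sketch, not the authors' exact implementation: it assumes a shared PyTorch model and one representative (inputs, labels) batch per language, computes a flattened task-loss gradient per language, and measures pairwise cosine similarity. The names per_language_batches and gradient_vector are illustrative, not from the paper.

    # Illustrative sketch only; not the GradSim implementation from the paper.
    import torch
    import torch.nn.functional as F

    def gradient_vector(model, loss_fn, inputs, labels):
        """Flatten the gradients of the task loss on one language's batch into a single vector."""
        model.zero_grad()
        loss = loss_fn(model(inputs), labels)
        params = [p for p in model.parameters() if p.requires_grad]
        grads = torch.autograd.grad(loss, params)
        return torch.cat([g.reshape(-1) for g in grads])

    def gradient_similarity_matrix(model, loss_fn, per_language_batches):
        """Pairwise cosine similarity of per-language gradients; higher values suggest compatibility."""
        langs = list(per_language_batches)
        vecs = [gradient_vector(model, loss_fn, *per_language_batches[lang]) for lang in langs]
        sim = torch.stack([torch.stack([F.cosine_similarity(a, b, dim=0) for b in vecs]) for a in vecs])
        return langs, sim

Languages with high mutual gradient similarity could then be clustered (e.g., with any standard clustering algorithm over the similarity matrix) to form groups for joint multilingual training.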
Anthology ID:
2023.emnlp-main.282
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
4631–4646
URL:
https://aclanthology.org/2023.emnlp-main.282
DOI:
10.18653/v1/2023.emnlp-main.282
Cite (ACL):
Mingyang Wang, Heike Adel, Lukas Lange, Jannik Strötgen, and Hinrich Schuetze. 2023. GradSim: Gradient-Based Language Grouping for Effective Multilingual Training. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 4631–4646, Singapore. Association for Computational Linguistics.
Cite (Informal):
GradSim: Gradient-Based Language Grouping for Effective Multilingual Training (Wang et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.282.pdf
Video:
https://aclanthology.org/2023.emnlp-main.282.mp4