An Empirical Comparison of Vocabulary Expansion and Initialization Approaches For Language Models

Nandini Mundra, Aditya Nanda Kishore Khandavally, Raj Dabre, Ratish Puduppully, Anoop Kunchukuttan, Mitesh M Khapra


Abstract
Language Models (LMs) excel at natural language processing tasks in English but show reduced performance in most other languages. This problem is commonly tackled by continually pre-training and fine-tuning these models on those languages. A significant issue in this process is the limited vocabulary coverage of the original model's tokenizer, which leads to inadequate representation of new languages and necessitates an expansion of the tokenizer. Initializing the embeddings of the new vocabulary items presents a further challenge. Current strategies require cross-lingual embeddings and lack a solid theoretical foundation as well as comparisons with strong baselines. In this paper, we first establish theoretically that initializing new embeddings within the convex hull of the existing embeddings is a good choice, and then propose a novel yet simple approach, Constrained Word2Vec (CW2V), which does not require cross-lingual embeddings. Our study evaluates different initialization methods for expanding RoBERTa and LLaMA 2 across four languages and five tasks. The results show that CW2V performs as well as, or even better than, more advanced techniques. Additionally, simpler approaches such as multivariate initialization perform on par with these advanced methods, indicating that efficient large-scale multilingual continued pretraining can be achieved even with simpler initialization schemes.
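Only the abstract is available here, so the following is a minimal NumPy sketch of the two initialization ideas it names, not the authors' implementation: drawing each new embedding inside the convex hull of the existing embeddings (the property the paper argues for, and which CW2V is designed to satisfy), and a multivariate Gaussian baseline fit to the existing embeddings. The function names, the per-token anchor count k, and the Dirichlet weighting are illustrative assumptions.

import numpy as np

def convex_combination_init(existing: np.ndarray, n_new: int,
                            k: int = 8, seed: int = 0) -> np.ndarray:
    """Draw each new embedding as a convex combination of k randomly
    chosen existing rows, so it lies inside the convex hull of the
    original embedding matrix."""
    rng = np.random.default_rng(seed)
    n_old, dim = existing.shape
    new_emb = np.empty((n_new, dim), dtype=existing.dtype)
    for i in range(n_new):
        anchors = rng.choice(n_old, size=k, replace=False)
        weights = rng.dirichlet(np.ones(k))  # non-negative, sums to 1
        new_emb[i] = weights @ existing[anchors]
    return new_emb

def multivariate_init(existing: np.ndarray, n_new: int,
                      seed: int = 0) -> np.ndarray:
    """Baseline from the paper's comparison: sample new embeddings from
    a multivariate Gaussian fit to the existing embeddings."""
    rng = np.random.default_rng(seed)
    mean = existing.mean(axis=0)
    cov = np.cov(existing, rowvar=False)
    return rng.multivariate_normal(mean, cov, size=n_new)

# Toy usage: a 1000-token vocabulary with 64-dim embeddings, expanded by 50.
old = np.random.default_rng(1).standard_normal((1000, 64))
print(convex_combination_init(old, n_new=50).shape)  # (50, 64)
print(multivariate_init(old, n_new=50).shape)        # (50, 64)

Any point written as a non-negative, sum-to-one combination of existing rows is by definition inside their convex hull, which is why the Dirichlet weights guarantee the constraint; the Gaussian baseline carries no such guarantee but matches the first two moments of the existing embeddings.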
Anthology ID:
2024.conll-1.8
Volume:
Proceedings of the 28th Conference on Computational Natural Language Learning
Month:
November
Year:
2024
Address:
Miami, FL, USA
Editors:
Libby Barak, Malihe Alikhani
Venue:
CoNLL
Publisher:
Association for Computational Linguistics
Pages:
84–104
URL:
https://aclanthology.org/2024.conll-1.8
Cite (ACL):
Nandini Mundra, Aditya Nanda Kishore Khandavally, Raj Dabre, Ratish Puduppully, Anoop Kunchukuttan, and Mitesh M Khapra. 2024. An Empirical Comparison of Vocabulary Expansion and Initialization Approaches For Language Models. In Proceedings of the 28th Conference on Computational Natural Language Learning, pages 84–104, Miami, FL, USA. Association for Computational Linguistics.
Cite (Informal):
An Empirical Comparison of Vocabulary Expansion and Initialization Approaches For Language Models (Mundra et al., CoNLL 2024)
PDF:
https://aclanthology.org/2024.conll-1.8.pdf