Bootstrapping Embeddings for Low Resource Languages

Merve Basoz, Andrew Horne, Mattia Opper

Abstract
Embedding models are crucial to modern NLP. However, the most effective models rely on carefully constructed supervised finetuning data. For high-resource languages such as English, such datasets are readily available; for hundreds of other languages, they are simply non-existent. We investigate whether large language models can help bridge this gap. We test three strategies for generating the synthetic triplet data used to optimise embedding models: in-context learning, as well as two novel approaches that leverage adapter composition and cross-lingual finetuning of the LLM generator (XL-LoRA), respectively. We find that while in-context learning still falls short of strong non-synthetic baselines, adapter composition and XL-LoRA yield strong performance gains across a wide array of tasks and languages, offering a clear, scalable pathway to producing performant embedding models for many languages.
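The abstract refers to optimising embedding models on (anchor, positive, negative) triplets. The paper's actual models, data, and hyperparameters are not given on this page; as background only, the following is a minimal PyTorch sketch of what triplet-based embedding finetuning typically looks like, with the encoder, batch shapes, margin, and learning rate all placeholder assumptions rather than the authors' setup.

```python
import torch
import torch.nn.functional as F
from torch import nn

# Hypothetical stand-in encoder: in practice this would be a pretrained
# multilingual transformer, which this page does not specify.
class MeanPoolEncoder(nn.Module):
    def __init__(self, vocab_size=30000, dim=256):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, dim)

    def forward(self, token_ids):
        # (batch, seq_len) -> L2-normalised (batch, dim) sentence embeddings
        return F.normalize(self.emb(token_ids).mean(dim=1), dim=-1)

encoder = MeanPoolEncoder()
loss_fn = nn.TripletMarginLoss(margin=0.5)  # margin is an assumed value
optimizer = torch.optim.AdamW(encoder.parameters(), lr=2e-5)

# One batch of synthetic triplets as token ids (random here for illustration;
# in the paper these would come from an LLM generator).
anchor = torch.randint(0, 30000, (8, 32))
positive = torch.randint(0, 30000, (8, 32))
negative = torch.randint(0, 30000, (8, 32))

# Pull anchor toward positive and away from negative in embedding space.
loss = loss_fn(encoder(anchor), encoder(positive), encoder(negative))
loss.backward()
optimizer.step()
```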
Anthology ID:
2026.loreslm-1.35
Volume:
Proceedings of the Second Workshop on Language Models for Low-Resource Languages (LoResLM 2026)
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Hansi Hettiarachchi, Tharindu Ranasinghe, Alistair Plum, Paul Rayson, Ruslan Mitkov, Mohamed Gaber, Damith Premasiri, Fiona Anting Tan, Lasitha Uyangodage
Venue:
LoResLM
Publisher:
Association for Computational Linguistics
Pages:
408–425
URL:
https://aclanthology.org/2026.loreslm-1.35/
Cite (ACL):
Merve Basoz, Andrew Horne, and Mattia Opper. 2026. Bootstrapping Embeddings for Low Resource Languages. In Proceedings of the Second Workshop on Language Models for Low-Resource Languages (LoResLM 2026), pages 408–425, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
Bootstrapping Embeddings for Low Resource Languages (Basoz et al., LoResLM 2026)
PDF:
https://aclanthology.org/2026.loreslm-1.35.pdf