Adapting Open-Source Generative Large Language Models for Low-Resource Languages: A Case Study for Turkish

Cagri Toraman


Abstract
Despite advancements in English-dominant generative large language models, further development is needed for low-resource languages to enhance global accessibility. The primary methods for representing these languages are monolingual and multilingual pretraining. Monolingual pretraining is expensive due to hardware requirements, and multilingual models often have uneven performance across languages. This study explores an alternative solution by adapting large language models, primarily trained on English, to low-resource languages. We assess various strategies, including continual training, instruction fine-tuning, task-specific fine-tuning, and vocabulary extension. The results show that continual training improves language comprehension, as reflected in perplexity scores, and task-specific tuning generally enhances performance on downstream tasks. However, extending the vocabulary shows no substantial benefit. Additionally, while larger models improve task performance with few-shot tuning, multilingual models perform worse than their monolingual counterparts when adapted.
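To make the adaptation strategies concrete, below is a minimal sketch (in Python, using the Hugging Face transformers and datasets libraries) of two of the steps the abstract names: vocabulary extension followed by continual pretraining on a Turkish corpus. This is not the paper's code; the base model name, the corpus path, the token list, and all hyperparameters are illustrative assumptions.

```python
# Hedged sketch of vocabulary extension + continual pretraining.
# Base model, corpus file, token list, and hyperparameters are assumptions,
# not the paper's actual setup.
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
    DataCollatorForLanguageModeling,
)
from datasets import load_dataset

base_model = "meta-llama/Llama-2-7b-hf"  # assumed English-dominant base model
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model)

# Vocabulary extension: add Turkish-specific tokens and resize the
# embedding matrix so the new rows can be learned during continual training.
new_tokens = ["ğ", "ş", "ı", "İ", "ç", "ö", "ü"]  # placeholder token list
tokenizer.add_tokens(new_tokens)
model.resize_token_embeddings(len(tokenizer))

# Continual pretraining: plain causal-LM training on a monolingual Turkish
# corpus (the file path here is hypothetical).
dataset = load_dataset("text", data_files={"train": "turkish_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)  # causal LM

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="llama2-tr",
        num_train_epochs=1,
        per_device_train_batch_size=4,
    ),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
```

Instruction fine-tuning and task-specific fine-tuning follow the same training loop, differing only in the supervised dataset used in place of the raw corpus.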
Anthology ID:
2024.mrl-1.3
Volume:
Proceedings of the Fourth Workshop on Multilingual Representation Learning (MRL 2024)
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Jonne Sälevä, Abraham Owodunni
Venue:
MRL
Publisher:
Association for Computational Linguistics
Pages:
30–44
URL:
https://aclanthology.org/2024.mrl-1.3
Cite (ACL):
Cagri Toraman. 2024. Adapting Open-Source Generative Large Language Models for Low-Resource Languages: A Case Study for Turkish. In Proceedings of the Fourth Workshop on Multilingual Representation Learning (MRL 2024), pages 30–44, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Adapting Open-Source Generative Large Language Models for Low-Resource Languages: A Case Study for Turkish (Toraman, MRL 2024)
PDF:
https://aclanthology.org/2024.mrl-1.3.pdf