AfriInstruct: Instruction Tuning of African Languages for Diverse Tasks
Kosei Uemura | Mahe Chen | Alex Pejovic | Chika Maduabuchi | Yifei Sun | En-Shiun Annie Lee
Findings of the Association for Computational Linguistics: EMNLP 2024
Large language models (LLMs) perform substantially worse on African languages than on high-resource languages. To address this gap, we introduce AfriInstruct, which specializes in instruction tuning for multiple African languages covering various tasks. We trained LLaMA-2-7B with continual pretraining followed by instruction fine-tuning, and the resulting model demonstrates superior performance across multiple tasks. In our mixed-task evaluation, the model outperforms GPT-3.5-Turbo and other baseline models of similar size. Our contributions help close the critical performance gap between high-resource and African languages.
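The two-stage recipe the abstract describes (continual pretraining on raw text, then instruction fine-tuning on prompt/response pairs) can be illustrated with a minimal sketch using Hugging Face Transformers. This is not the authors' released code: the prompt template, the toy dataset, and all hyperparameters are placeholders standing in for the AfriInstruct corpus and training setup.

```python
# Minimal sketch of stage 2 (instruction fine-tuning) of the recipe in the
# abstract, assuming Hugging Face Transformers and Datasets. The prompt
# template, example data, and hyperparameters are illustrative placeholders,
# not the paper's actual configuration.
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)
from datasets import Dataset

model_name = "meta-llama/Llama-2-7b-hf"  # base model named in the abstract (gated; requires HF access)
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Placeholder instruction data; the real AfriInstruct data covers
# multiple African languages and task types.
pairs = [{"instruction": "Translate to Swahili: Good morning.",
          "response": "Habari ya asubuhi."}]

def to_text(ex):
    # Simple prompt template; the paper's actual template may differ.
    return {"text": f"### Instruction:\n{ex['instruction']}\n"
                    f"### Response:\n{ex['response']}"}

ds = Dataset.from_list(pairs).map(to_text)
ds = ds.map(lambda ex: tokenizer(ex["text"], truncation=True, max_length=512),
            remove_columns=ds.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="afriinstruct-sft",
                           per_device_train_batch_size=1,
                           num_train_epochs=1),
    train_dataset=ds,
    # mlm=False gives the standard causal-LM (next-token) objective.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

Stage 1 (continual pretraining) would use the same causal-LM loop but on unlabeled African-language text rather than templated instruction pairs.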