Kosei Uemura
2024
Empowering the Future with Multilinguality and Language Diversity
En-Shiun Lee | Kosei Uemura | Syed Wasti | Mason Shipton
Proceedings of the Sixth Workshop on Teaching NLP
The rapid advancement and widespread adoption of Large Language Models have made it necessary to incorporate these cutting-edge techniques into Natural Language Processing (NLP) curricula, even under limited computing resources. This paper presents an applied NLP course for upper-year computer science undergraduate students that covers state-of-the-art techniques with an emphasis on multilinguality and language diversity. We hope to empower learners to advance their language communities while preparing for industry.
AfriInstruct: Instruction Tuning of African Languages for Diverse Tasks
Kosei Uemura | Mahe Chen | Alex Pejovic | Chika Maduabuchi | Yifei Sun | En-Shiun Annie Lee
Findings of the Association for Computational Linguistics: EMNLP 2024
Large language models (LLMs) perform worse on African languages than on high-resource languages. To address this issue, we introduce AfriInstruct, which specializes in instruction tuning of multiple African languages across various tasks. We trained LLaMA-2-7B using continual pretraining followed by instruction fine-tuning, and the resulting model demonstrates superior performance across multiple tasks. Our mixed-task evaluation shows that our model outperforms GPT-3.5-Turbo and other baseline models of similar size. Our contributions help close a critical performance gap between LLMs for high-resource and African languages.
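The abstract describes a two-stage recipe: continual pretraining on raw text, then instruction fine-tuning on instruction/response pairs. Below is a minimal sketch of that pipeline using Hugging Face transformers and datasets; the data files (african_corpus.jsonl, afri_instruct.jsonl), field names, prompt template, and hyperparameters are illustrative assumptions, not the paper's actual setup.

```python
# Sketch of continual pretraining + instruction fine-tuning for a causal LM.
# All file names, fields, and hyperparameters below are placeholders.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "meta-llama/Llama-2-7b-hf"
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # LLaMA has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

# mlm=False gives next-token (causal LM) labels for both stages.
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

def tokenize_text(batch):
    return tokenizer(batch["text"], truncation=True, max_length=1024)

# Stage 1: continual pretraining on raw African-language text (placeholder corpus).
raw = load_dataset("json", data_files="african_corpus.jsonl", split="train")
raw = raw.map(tokenize_text, batched=True, remove_columns=raw.column_names)
Trainer(
    model=model,
    args=TrainingArguments(output_dir="cpt", num_train_epochs=1),
    train_dataset=raw,
    data_collator=collator,
).train()

def format_instructions(batch):
    # Hypothetical prompt template over hypothetical "instruction"/"output" fields.
    texts = [
        f"### Instruction:\n{ins}\n\n### Response:\n{out}"
        for ins, out in zip(batch["instruction"], batch["output"])
    ]
    return tokenizer(texts, truncation=True, max_length=1024)

# Stage 2: instruction fine-tuning on task-diverse instruction data (placeholder file).
sft = load_dataset("json", data_files="afri_instruct.jsonl", split="train")
sft = sft.map(format_instructions, batched=True, remove_columns=sft.column_names)
Trainer(
    model=model,
    args=TrainingArguments(output_dir="sft", num_train_epochs=3),
    train_dataset=sft,
    data_collator=collator,
).train()
```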