Large Language Models Only Pass Primary School Exams in Indonesia: A Comprehensive Test on IndoMMLU

Fajri Koto, Nurul Aisyah, Haonan Li, Timothy Baldwin


Abstract
Although large language models (LLMs) are often pre-trained on large-scale multilingual texts, their reasoning abilities and real-world knowledge are mainly evaluated on English datasets. Assessing LLM capabilities beyond English is increasingly vital but hindered by the lack of suitable datasets. In this work, we introduce IndoMMLU, the first multi-task language understanding benchmark for Indonesian culture and languages, which consists of questions from primary school to university entrance exams in Indonesia. By employing professional teachers, we obtain 14,981 questions across 64 tasks and education levels, with 46% of the questions focusing on assessing proficiency in the Indonesian language and knowledge of nine local languages and cultures in Indonesia. Our empirical evaluations show that GPT-3.5 only manages to pass the Indonesian primary school level, with limited knowledge of local Indonesian languages and culture. Other smaller models such as BLOOMZ and Falcon perform at even lower levels.
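As an illustration of the kind of zero-shot multiple-choice evaluation described in the abstract, the sketch below scores each answer option by the log-probability a causal LM assigns to it and picks the highest-scoring one. The model name, prompt template, and example item are assumptions for illustration only; this is not the authors' released evaluation code or the exact IndoMMLU data schema.

```python
# Minimal sketch of zero-shot multiple-choice evaluation on exam-style
# questions, in the spirit of the IndoMMLU setup described above.
# NOTE: the model choice, prompt template, and example item below are
# illustrative assumptions, not the authors' released evaluation pipeline.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "bigscience/bloomz-560m"  # small stand-in model for illustration
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)
model.eval()


def option_logprob(question: str, option: str) -> float:
    """Score one answer option by the total log-probability the model
    assigns to the option tokens, conditioned on the question."""
    prompt = f"Pertanyaan: {question}\nJawaban:"  # Indonesian: Question / Answer
    prompt_ids = tokenizer(prompt, return_tensors="pt").input_ids
    full_ids = tokenizer(prompt + " " + option, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(full_ids).logits
    # Log-probability of each token given its preceding context.
    log_probs = torch.log_softmax(logits[:, :-1], dim=-1)
    targets = full_ids[:, 1:]
    token_lp = log_probs.gather(-1, targets.unsqueeze(-1)).squeeze(-1)
    # Sum only over the answer-option tokens (everything after the prompt).
    answer_start = prompt_ids.shape[1] - 1
    return token_lp[0, answer_start:].sum().item()


def predict(question: str, options: list[str]) -> int:
    """Return the index of the highest-scoring answer option."""
    scores = [option_logprob(question, opt) for opt in options]
    return max(range(len(options)), key=lambda i: scores[i])


# Hypothetical exam item, purely for demonstration.
question = "Ibu kota Indonesia adalah ..."
options = ["Jakarta", "Surabaya", "Bandung", "Medan"]
print(options[predict(question, options)])
```

Accuracy over a set of such items, grouped by school level, is what underlies statements like "GPT-3.5 only manages to pass the Indonesian primary school level."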
Anthology ID:
2023.emnlp-main.760
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
12359–12374
URL:
https://aclanthology.org/2023.emnlp-main.760
DOI:
10.18653/v1/2023.emnlp-main.760
Cite (ACL):
Fajri Koto, Nurul Aisyah, Haonan Li, and Timothy Baldwin. 2023. Large Language Models Only Pass Primary School Exams in Indonesia: A Comprehensive Test on IndoMMLU. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 12359–12374, Singapore. Association for Computational Linguistics.
Cite (Informal):
Large Language Models Only Pass Primary School Exams in Indonesia: A Comprehensive Test on IndoMMLU (Koto et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.760.pdf
Video:
https://aclanthology.org/2023.emnlp-main.760.mp4