Evaluating the Performance of Transformers in Translating Low-Resource Languages through Akkadian

Daniel A. Jones, Ruslan Mitkov


Abstract
In this paper, we evaluate the performance of various fine-tuned, transformer-based models in translating Akkadian into English. Using annotated Akkadian data, we seek to identify potential considerations for developing models for other low-resource languages that do not yet have comparably robust data. The results of this study show the potency, but also the cost inefficiency, of Large Language Models compared to smaller Neural Machine Translation models. We also found significant evidence for the importance of fine-tuning machine translation models from related languages.
Anthology ID: 2025.r2lm-1.5
Volume: Proceedings of the First Workshop on Comparative Performance Evaluation: From Rules to Language Models
Month: September
Year: 2025
Address: Varna, Bulgaria
Editors: Alicia Picazo-Izquierdo, Ernesto Luis Estevanell-Valladares, Ruslan Mitkov, Rafael Muñoz Guillena, Raúl García Cerdá
Venues: R2LM | WS
Publisher: INCOMA Ltd., Shoumen, Bulgaria
Pages: 39–47
URL: https://aclanthology.org/2025.r2lm-1.5/
Cite (ACL): Daniel A. Jones and Ruslan Mitkov. 2025. Evaluating the Performance of Transformers in Translating Low-Resource Languages through Akkadian. In Proceedings of the First Workshop on Comparative Performance Evaluation: From Rules to Language Models, pages 39–47, Varna, Bulgaria. INCOMA Ltd., Shoumen, Bulgaria.
Cite (Informal): Evaluating the Performance of Transformers in Translating Low-Resource Languages through Akkadian (Jones & Mitkov, R2LM 2025)
PDF: https://aclanthology.org/2025.r2lm-1.5.pdf