Deep Learning Meets Egyptology: a Hieroglyphic Transformer for Translating Ancient Egyptian

Mattia Cao, Nicola De Cao, Angelo Colonna, Alessandro Lenci


Abstract
This work explores the potential of Transformer models for translating ancient Egyptian hieroglyphs. We present a novel Hieroglyphic Transformer model, built upon the M2M-100 multilingual translation framework and trained on a dataset we customised from the Thesaurus Linguae Aegyptiae database. Our experiments demonstrate promising results, with the model achieving notable accuracy in translating hieroglyphs into both German and English. This work holds significant implications for Egyptology, potentially accelerating the translation process and unlocking new research approaches.
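Since the model is built on M2M-100, a rough illustration of how that framework is typically accessed may be helpful. The sketch below, which is not the authors' pipeline, loads an M2M-100 checkpoint through the Hugging Face transformers library and forces German output; the checkpoint name, the transliterated Egyptian input string, and the placeholder source-language code are illustrative assumptions, since the paper's own preprocessing of hieroglyphic text is not described on this page.

```python
# Minimal sketch: loading M2M-100 via Hugging Face transformers and generating
# a German translation. NOT the authors' pipeline; the checkpoint, the input
# transliteration, and the source-language handling are assumptions only.
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

model_name = "facebook/m2m100_418M"  # assumed base checkpoint (418M-parameter variant)
tokenizer = M2M100Tokenizer.from_pretrained(model_name)
model = M2M100ForConditionalGeneration.from_pretrained(model_name)

# Hypothetical transliterated hieroglyphic input; M2M-100 has no Ancient
# Egyptian language code, so a placeholder source language is set here.
text = "jnk zXA.w n nswt"          # transliteration, purely illustrative
tokenizer.src_lang = "en"          # placeholder; fine-tuning would adapt this

encoded = tokenizer(text, return_tensors="pt")
generated = model.generate(
    **encoded,
    forced_bos_token_id=tokenizer.get_lang_id("de"),  # force German output
    max_new_tokens=64,
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```

In practice, adapting such a model to hieroglyphic input would involve fine-tuning on a parallel corpus such as the one the authors derive from the Thesaurus Linguae Aegyptiae; the exact training setup is described in the paper itself.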
Anthology ID: 2024.ml4al-1.9
Volume: Proceedings of the 1st Workshop on Machine Learning for Ancient Languages (ML4AL 2024)
Month: August
Year: 2024
Address: Hybrid in Bangkok, Thailand and online
Editors: John Pavlopoulos, Thea Sommerschield, Yannis Assael, Shai Gordin, Kyunghyun Cho, Marco Passarotti, Rachele Sprugnoli, Yudong Liu, Bin Li, Adam Anderson
Venues: ML4AL | WS
Publisher: Association for Computational Linguistics
Pages: 71–86
URL: https://aclanthology.org/2024.ml4al-1.9
Cite (ACL): Mattia Cao, Nicola De Cao, Angelo Colonna, and Alessandro Lenci. 2024. Deep Learning Meets Egyptology: a Hieroglyphic Transformer for Translating Ancient Egyptian. In Proceedings of the 1st Workshop on Machine Learning for Ancient Languages (ML4AL 2024), pages 71–86, Hybrid in Bangkok, Thailand and online. Association for Computational Linguistics.
Cite (Informal): Deep Learning Meets Egyptology: a Hieroglyphic Transformer for Translating Ancient Egyptian (Cao et al., ML4AL-WS 2024)
PDF: https://aclanthology.org/2024.ml4al-1.9.pdf