A Pilot Study for BERT Language Modelling and Morphological Analysis for Ancient and Medieval Greek

Pranaydeep Singh, Gorik Rutten, Els Lefever


Abstract
This paper presents a pilot study on the automatic linguistic preprocessing of Ancient and Byzantine Greek, and more specifically on morphological analysis. To this end, a novel subword-based BERT language model was trained on a varied corpus of Modern, Ancient and Post-classical Greek texts. Subsequently, the resulting BERT embeddings were used to train a fine-grained Part-of-Speech tagger for Ancient and Byzantine Greek. In addition, a corpus of Greek epigrams was manually annotated, and the resulting gold standard was used to evaluate the performance of the morphological analyser on Byzantine Greek. The experimental results show a very good perplexity score (4.9) for the BERT language model and state-of-the-art performance for the fine-grained Part-of-Speech tagger, both on in-domain data (treebanks containing a mixture of Classical and Medieval Greek) and on the newly created Byzantine Greek gold standard data set. The language models and associated code are made available for use at https://github.com/pranaydeeps/Ancient-Greek-BERT
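The perplexity score reported in the abstract is the standard intrinsic measure for language models: the exponential of the mean negative log-likelihood per token, so lower is better. The sketch below (the NLL values are illustrative, not taken from the paper) shows how a perplexity of roughly 4.9 relates to per-token negative log-likelihoods of about 1.589 nats:

```python
import math

def perplexity(nll_per_token):
    """Perplexity = exp(mean negative log-likelihood per token), in nats."""
    return math.exp(sum(nll_per_token) / len(nll_per_token))

# A mean NLL of ~1.589 nats per token corresponds to a perplexity of ~4.9.
print(round(perplexity([1.589, 1.589, 1.589]), 1))
```

Intuitively, a perplexity of 4.9 means the model is, on average, as uncertain as if it were choosing uniformly among about five candidate subword tokens at each position.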
Anthology ID:
2021.latechclfl-1.15
Volume:
Proceedings of the 5th Joint SIGHUM Workshop on Computational Linguistics for Cultural Heritage, Social Sciences, Humanities and Literature
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic (online)
Venues:
CLFL | EMNLP | LaTeCH | LaTeCHCLfL
SIG:
SIGHUM
Publisher:
Association for Computational Linguistics
Pages:
128–137
URL:
https://aclanthology.org/2021.latechclfl-1.15
PDF:
https://aclanthology.org/2021.latechclfl-1.15.pdf
Code
pranaydeeps/ancient-greek-bert