AraModernBERT: Transtokenized Initialization and Long-Context Encoder Modeling for Arabic

Omar Elshehy, Omer Nacar, Abdelbasset Djamai, Muhammed Ragab, Khloud AL Jallad, Mona Abdelazim


Abstract
Encoder-only transformer models remain widely used for discriminative NLP tasks, yet recent architectural advances have largely focused on English. In this work, we present AraModernBERT, an adaptation of the ModernBERT encoder architecture to Arabic, and study the impact of transtokenized embedding initialization and native long-context modeling up to 8,192 tokens. We show that transtokenization is essential for Arabic language modeling, yielding dramatic improvements in masked language modeling performance compared to non-transtokenized initialization. We further demonstrate that AraModernBERT supports stable and effective long-context modeling, achieving improved intrinsic language modeling performance at extended sequence lengths. Downstream evaluations on Arabic natural language understanding tasks, including inference, offensive language detection, question-question similarity, and named entity recognition, confirm strong transfer to discriminative and sequence labeling settings. Our results highlight practical considerations for adapting modern encoder architectures to Arabic and other languages written in Arabic-derived scripts.
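To make the abstract's key term concrete: transtokenized embedding initialization maps each token of a new target-language (here, Arabic) tokenizer onto pieces of the source model's tokenizer and builds its initial embedding from those pieces, instead of starting from random vectors. The sketch below shows one common flavor of this idea, plain subtoken averaging with Hugging Face Transformers; the checkpoint names, tokenizer choice, and averaging scheme are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch of embedding initialization via token-to-subtoken mapping.
# Checkpoint names and the averaging scheme are assumptions for illustration,
# not the authors' recipe.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

source_name = "answerdotai/ModernBERT-base"          # assumed English source encoder
target_tok_name = "aubmindlab/bert-base-arabertv02"   # assumed Arabic tokenizer

src_model = AutoModelForMaskedLM.from_pretrained(source_name)
src_tok = AutoTokenizer.from_pretrained(source_name)
tgt_tok = AutoTokenizer.from_pretrained(target_tok_name)

src_emb = src_model.get_input_embeddings().weight.detach()
hidden = src_emb.shape[1]
new_emb = torch.empty(len(tgt_tok), hidden)

for tok, idx in tgt_tok.get_vocab().items():
    # Decompose each target token's surface string with the source tokenizer
    # and average the corresponding source embeddings.
    text = tgt_tok.convert_tokens_to_string([tok])
    piece_ids = src_tok(text, add_special_tokens=False)["input_ids"]
    if piece_ids:
        new_emb[idx] = src_emb[piece_ids].mean(dim=0)
    else:
        # Fallback for tokens (e.g., special tokens) with no source-side pieces.
        new_emb[idx] = src_emb.mean(dim=0)

# Swap in the new vocabulary and copy over the initialized embedding matrix.
src_model.resize_token_embeddings(len(tgt_tok))
src_model.get_input_embeddings().weight.data.copy_(new_emb)
```

In practice, special tokens and any untied MLM output head would need matching treatment, and the paper's alignment may well be more sophisticated than plain subtoken averaging; the sketch is only meant to illustrate the general mechanism behind the term.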
Anthology ID:
2026.abjadnlp-1.39
Volume:
Proceedings of the 2nd Workshop on NLP for Languages Using Arabic Script
Month:
March
Year:
2026
Address:
Rabat, Morocco
Venues:
AbjadNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
313–321
URL:
https://aclanthology.org/2026.abjadnlp-1.39/
Cite (ACL):
Omar Elshehy, Omer Nacar, Abdelbasset Djamai, Muhammed Ragab, Khloud AL Jallad, and Mona Abdelazim. 2026. AraModernBERT: Transtokenized Initialization and Long-Context Encoder Modeling for Arabic. In Proceedings of the 2nd Workshop on NLP for Languages Using Arabic Script, pages 313–321, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
AraModernBERT: Transtokenized Initialization and Long-Context Encoder Modeling for Arabic (Elshehy et al., AbjadNLP 2026)
PDF:
https://aclanthology.org/2026.abjadnlp-1.39.pdf