Arabic Author Attribution Using Transformer-Based Models: Insights from the AbjadAuthorID Shared Task

Ghader Kurdi


Abstract
This paper describes the author’s participation in the Arabic track of the AbjadAuthorID shared task, which focuses on multiclass authorship attribution using transformer-based models. The task involves identifying the author of a given text excerpt drawn from diverse genres and historical periods, posing significant challenges due to stylistic variation and linguistic richness. Experimental results demonstrate strong performance: an ensemble of MARBERTv2 and ARBERTv2 achieves an accuracy of 92% and a macro-averaged F1 score of 89%, ranking second on the leaderboard and highlighting the effectiveness of the proposed approach for Arabic authorship identification.
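The abstract reports an ensemble of two transformer classifiers. The paper's exact combination method is not stated here; a common choice is to average the two models' class probabilities and take the argmax. Below is a minimal sketch of that scheme, assuming each model emits per-class logits for each text excerpt (the logit values shown are hypothetical, not from the paper):

```python
import numpy as np

def softmax(logits, axis=-1):
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def ensemble_predict(logits_a, logits_b):
    # Average the two models' class probabilities, then pick
    # the most probable author for each excerpt.
    probs = (softmax(logits_a) + softmax(logits_b)) / 2.0
    return probs.argmax(axis=-1)

# Hypothetical logits: 2 excerpts scored over 3 candidate authors.
logits_model_a = np.array([[2.0, 0.5, 0.1], [0.2, 0.1, 1.5]])
logits_model_b = np.array([[1.8, 0.7, 0.2], [0.3, 2.0, 1.4]])
print(ensemble_predict(logits_model_a, logits_model_b))  # -> [0 2]
```

Probability averaging (soft voting) is only one option; majority voting over hard labels is another common alternative when combining fine-tuned BERT-family models.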
Anthology ID:
2026.abjadnlp-1.66
Volume:
Proceedings of the 2nd Workshop on NLP for Languages Using Arabic Script
Month:
March
Year:
2026
Address:
Rabat, Morocco
Venues:
AbjadNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
520–524
URL:
https://aclanthology.org/2026.abjadnlp-1.66/
Cite (ACL):
Ghader Kurdi. 2026. Arabic Author Attribution Using Transformer-Based Models: Insights from the AbjadAuthorID Shared Task. In Proceedings of the 2nd Workshop on NLP for Languages Using Arabic Script, pages 520–524, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
Arabic Author Attribution Using Transformer-Based Models: Insights from the AbjadAuthorID Shared Task (Kurdi, AbjadNLP 2026)
PDF:
https://aclanthology.org/2026.abjadnlp-1.66.pdf