CATT: Character-based Arabic Tashkeel Transformer

Faris Alasmary, Orjuwan Zaafarani, Ahmad Ghannam


Abstract
Tashkeel, or Arabic Text Diacritization (ATD), greatly enhances the comprehension of Arabic text by removing ambiguity and minimizing the risk of misinterpretations caused by its absence. It plays a crucial role in improving Arabic text processing, particularly in applications such as text-to-speech and machine translation. This paper introduces a new approach to training ATD models. First, we finetuned two transformers, encoder-only and encoder-decoder, that were initialized from a pretrained character-based BERT. Then, we applied the Noisy-Student approach to boost the performance of the best model. We evaluated our models alongside 11 commercial and open-source models using two manually labeled benchmark datasets: WikiNews and our CATT dataset. Our findings show that our top model surpasses all evaluated models by relative Diacritic Error Rates (DERs) of 30.83% and 35.21% on WikiNews and CATT, respectively, achieving state-of-the-art performance in ATD. In addition, we show that our model outperforms GPT-4-turbo on the CATT dataset by a relative DER of 9.36%. We open-source our CATT models and benchmark dataset for the research community.
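The relative DER figures in the abstract compare error rates between systems rather than reporting absolute differences. The following minimal Python sketch illustrates the usual computation, assuming DER is the fraction of diacritizable characters assigned a wrong diacritic; the numbers used are illustrative placeholders, not figures from the paper.

# Hedged sketch: how DER and a relative DER reduction are typically computed.
def der(num_wrong_diacritics: int, num_diacritizable_chars: int) -> float:
    """Diacritic Error Rate: fraction of diacritizable characters with an incorrect diacritic."""
    return num_wrong_diacritics / num_diacritizable_chars

def relative_der_reduction(der_baseline: float, der_model: float) -> float:
    """Relative improvement of a model's DER over a baseline's DER."""
    return (der_baseline - der_model) / der_baseline

# Example with made-up numbers: a drop from 3.0% to 2.0% DER
# corresponds to a relative reduction of about 33%.
print(relative_der_reduction(0.030, 0.020))  # ~0.333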
Anthology ID:
2024.arabicnlp-1.21
Volume:
Proceedings of The Second Arabic Natural Language Processing Conference
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Nizar Habash, Houda Bouamor, Ramy Eskander, Nadi Tomeh, Ibrahim Abu Farha, Ahmed Abdelali, Samia Touileb, Injy Hamed, Yaser Onaizan, Bashar Alhafni, Wissam Antoun, Salam Khalifa, Hatem Haddad, Imed Zitouni, Badr AlKhamissi, Rawan Almatham, Khalil Mrini
Venues:
ArabicNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
250–257
URL:
https://aclanthology.org/2024.arabicnlp-1.21
DOI:
10.18653/v1/2024.arabicnlp-1.21
Cite (ACL):
Faris Alasmary, Orjuwan Zaafarani, and Ahmad Ghannam. 2024. CATT: Character-based Arabic Tashkeel Transformer. In Proceedings of The Second Arabic Natural Language Processing Conference, pages 250–257, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
CATT: Character-based Arabic Tashkeel Transformer (Alasmary et al., ArabicNLP-WS 2024)
PDF:
https://aclanthology.org/2024.arabicnlp-1.21.pdf