Parameter-Efficient Adaptation of Self-Supervised Models for Arabic Speech Recognition

Wafa Mohammed Alshehri, Wasfi G. Al-khatib, Mohammad Ismail Amro


Abstract
Arabic speech recognition systems face distinct challenges due to the language's complex morphology and dialectal variation. Self-supervised learning (SSL) models like XLS-R have shown promising results, but their size, with over 300 million parameters, makes fine-tuning computationally expensive. In this work, we present the first comparative study of parameter-efficient fine-tuning (PEFT) methods, specifically LoRA and DoRA, applied to XLS-R for Arabic ASR. We evaluate on the newly released Common Voice Arabic v24.0 dataset, establishing new benchmarks. Our full fine-tuning achieves state-of-the-art results among XLS-R-based models with a 23.03% word error rate (WER). In our experiments, LoRA achieved a 36.10% WER while training just 2% of the model's parameters, and DoRA reached 45.20% WER in initial experiments. We analyze the trade-offs between accuracy and efficiency, offering practical guidance for developing Arabic ASR systems when computational resources are limited. The models and code are publicly available.
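
As a concrete illustration of the setup the abstract describes, here is a minimal sketch of wrapping XLS-R with a LoRA adapter using the Hugging Face transformers and peft libraries. This is not the authors' released code; the checkpoint name, vocabulary size, LoRA rank, alpha, dropout, and target modules are illustrative assumptions chosen to mirror the paper's reported ~2% trainable-parameter budget.

```python
# Sketch: LoRA adaptation of XLS-R for CTC-based ASR (assumed hyperparameters).
from transformers import Wav2Vec2ForCTC
from peft import LoraConfig, get_peft_model

# XLS-R (300M) checkpoint with a CTC head added for ASR fine-tuning.
# vocab_size is an assumed size for an Arabic character vocabulary.
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-xls-r-300m",
    vocab_size=64,
    ctc_loss_reduction="mean",
)

# LoRA: freeze the pretrained weights and train low-rank updates on the
# attention query/value projections. r, alpha, and dropout are assumed values.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # reports the small trainable fraction
```

In recent versions of peft, DoRA (the other PEFT method the paper evaluates) can be enabled with the same configuration by passing use_dora=True to LoraConfig.
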
Anthology ID:
2026.abjadnlp-1.40
Volume:
Proceedings of the 2nd Workshop on NLP for Languages Using Arabic Script
Month:
March
Year:
2026
Address:
Rabat, Morocco
Venues:
AbjadNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
322–328
URL:
https://aclanthology.org/2026.abjadnlp-1.40/
Cite (ACL):
Wafa Mohammed Alshehri, Wasfi G. Al-khatib, and Mohammad Ismail Amro. 2026. Parameter-Efficient Adaptation of Self-Supervised Models for Arabic Speech Recognition. In Proceedings of the 2nd Workshop on NLP for Languages Using Arabic Script, pages 322–328, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
Parameter-Efficient Adaptation of Self-Supervised Models for Arabic Speech Recognition (Alshehri et al., AbjadNLP 2026)
PDF:
https://aclanthology.org/2026.abjadnlp-1.40.pdf
Optional supplementary material:
 2026.abjadnlp-1.40.OptionalSupplementaryMaterial.docx