TCE at Qur’an QA 2023 Shared Task: Low Resource Enhanced Transformer-based Ensemble Approach for Qur’anic QA

Mohammed Elkomy, Amany Sarhan


Abstract
In this paper, we present our approach to tackling the Qur’an QA 2023 shared tasks A and B. To address the challenge of limited training data, we rely on transfer learning together with a voting ensemble to improve prediction stability across multiple runs. Additionally, we employ different architectures and learning mechanisms across a range of Arabic pre-trained transformer-based models for both tasks. To identify unanswerable questions, we propose a thresholding mechanism. Our top-performing systems substantially surpass the baseline on the hidden split, achieving a MAP score of 25.05% for task A and a partial Average Precision (pAP) of 57.11% for task B.
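The abstract mentions a voting ensemble over multiple fine-tuned runs combined with a thresholding mechanism for unanswerable questions. The minimal Python sketch below illustrates one plausible way such a combination could work; the candidate format, the score-averaging aggregation, and the threshold value are illustrative assumptions and are not taken from the paper.

```python
# Hypothetical sketch of a voting ensemble with a no-answer threshold.
# The dict keys ("answer", "score") and the default threshold are
# illustrative assumptions, not details reported in the paper.
from collections import defaultdict

def ensemble_predict(per_run_candidates, no_answer_threshold=0.5, top_k=10):
    """Aggregate ranked answer candidates produced by several fine-tuned runs.

    per_run_candidates: one list per run, each containing
        {"answer": str, "score": float} candidates for a single question.
    Returns a ranked list of answers, or [] if the question is judged
    unanswerable (best aggregated score below the threshold).
    """
    votes = defaultdict(list)
    for run in per_run_candidates:
        for cand in run:
            votes[cand["answer"]].append(cand["score"])

    # Average each candidate's confidence over all runs; candidates missing
    # from some runs implicitly contribute a score of zero for those runs.
    aggregated = {ans: sum(scores) / len(per_run_candidates)
                  for ans, scores in votes.items()}
    ranked = sorted(aggregated.items(), key=lambda kv: kv[1], reverse=True)

    # Thresholding: if even the best candidate is weak, predict "no answer".
    if not ranked or ranked[0][1] < no_answer_threshold:
        return []
    return [ans for ans, _ in ranked[:top_k]]
```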
Anthology ID: 2023.arabicnlp-1.81
Volume: Proceedings of ArabicNLP 2023
Month: December
Year: 2023
Address: Singapore (Hybrid)
Editors: Hassan Sawaf, Samhaa El-Beltagy, Wajdi Zaghouani, Walid Magdy, Ahmed Abdelali, Nadi Tomeh, Ibrahim Abu Farha, Nizar Habash, Salam Khalifa, Amr Keleg, Hatem Haddad, Imed Zitouni, Khalil Mrini, Rawan Almatham
Venues: ArabicNLP | WS
Publisher: Association for Computational Linguistics
Pages: 728–742
URL: https://aclanthology.org/2023.arabicnlp-1.81
DOI: 10.18653/v1/2023.arabicnlp-1.81
Cite (ACL): Mohammed Elkomy and Amany Sarhan. 2023. TCE at Qur’an QA 2023 Shared Task: Low Resource Enhanced Transformer-based Ensemble Approach for Qur’anic QA. In Proceedings of ArabicNLP 2023, pages 728–742, Singapore (Hybrid). Association for Computational Linguistics.
Cite (Informal): TCE at Qur’an QA 2023 Shared Task: Low Resource Enhanced Transformer-based Ensemble Approach for Qur’anic QA (Elkomy & Sarhan, ArabicNLP-WS 2023)
PDF: https://aclanthology.org/2023.arabicnlp-1.81.pdf
Video: https://aclanthology.org/2023.arabicnlp-1.81.mp4