SENIT at AraFinNLP2024: trust your model or combine two

Abdelmomen Nasr, Moez Ben HajHmida


Abstract
We describe our system submitted to the 2024 Shared Task on Arabic Financial NLP (Malaysha et al., 2024). We tackled Subtask 1, namely Multi-dialect Intent Detection. We used state-of-the-art pretrained contextualized text representation models and fine-tuned them for the downstream task at hand. We started by fine-tuning multilingual BERT and several Arabic variants, namely MARBERTV1, MARBERTV2, and CAMeLBERT. We then applied an ensembling technique that combines MARBERTV2 and CAMeLBERT embeddings to improve classification performance. The findings indicate that MARBERTV2 surpassed all the other models.
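The ensembling step described in the abstract can be sketched as late fusion of sentence embeddings: the [CLS] representations produced by the two fine-tuned encoders are concatenated and fed to a single classifier head. The sketch below uses random stand-ins for the two encoders (the paper's actual models, MARBERTV2 and CAMeLBERT, both produce 768-dimensional embeddings via the transformers library); the function names and the fusion details are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the [CLS] embeddings of the two fine-tuned
# encoders; in the real system these would come from MARBERTV2 and
# CAMeLBERT forward passes (both 768-dimensional).
def encode_marbertv2(batch_size):
    return rng.normal(size=(batch_size, 768))

def encode_camelbert(batch_size):
    return rng.normal(size=(batch_size, 768))

def ensemble_features(batch_size):
    """Late fusion: concatenate the two sentence embeddings per example."""
    return np.concatenate(
        [encode_marbertv2(batch_size), encode_camelbert(batch_size)],
        axis=1,
    )

# The fused 1536-dimensional vectors would then feed an intent classifier.
feats = ensemble_features(4)
print(feats.shape)  # (4, 1536)
```

Concatenation is one common fusion choice; averaging the two embeddings (keeping 768 dimensions) is an equally plausible variant when the encoders share a hidden size.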
Anthology ID:
2024.arabicnlp-1.39
Volume:
Proceedings of The Second Arabic Natural Language Processing Conference
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Nizar Habash, Houda Bouamor, Ramy Eskander, Nadi Tomeh, Ibrahim Abu Farha, Ahmed Abdelali, Samia Touileb, Injy Hamed, Yaser Onaizan, Bashar Alhafni, Wissam Antoun, Salam Khalifa, Hatem Haddad, Imed Zitouni, Badr AlKhamissi, Rawan Almatham, Khalil Mrini
Venues:
ArabicNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
428–432
URL:
https://aclanthology.org/2024.arabicnlp-1.39
Cite (ACL):
Abdelmomen Nasr and Moez Ben HajHmida. 2024. SENIT at AraFinNLP2024: trust your model or combine two. In Proceedings of The Second Arabic Natural Language Processing Conference, pages 428–432, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
SENIT at AraFinNLP2024: trust your model or combine two (Nasr & Ben HajHmida, ArabicNLP-WS 2024)
PDF:
https://aclanthology.org/2024.arabicnlp-1.39.pdf