ArSarcasm Shared Task: An Ensemble BERT Model for Sarcasm Detection in Arabic Tweets

Laila Bashmal, Daliyah AlZeer


Abstract
Detecting sarcasm has never been easy for machines. In this work, we present our submission to subtask 1 of the shared task on sarcasm and sentiment detection in Arabic, organized by the Sixth Workshop on Arabic Natural Language Processing. We explored different approaches based on BERT models. First, we fine-tuned the AraBERTv02 model for the sarcasm detection task. Then, we used a Sentence-BERT model trained with contrastive learning to extract representative tweet embeddings. Finally, inspired by how the human brain comprehends both the surface and the implicit meanings of sarcastic tweets, we combined the sentence embeddings with the fine-tuned AraBERTv02 model to further boost performance. Through the ensemble of the two models, our team ranked 5th out of 27 teams on the sarcasm detection subtask, with an F1-score of 59.89% on the official test data, only 2.36% below the 1st-place result, which confirms the capability of the combined model in detecting sarcasm.
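The combination step the abstract describes (merging a Sentence-BERT tweet embedding with the fine-tuned AraBERTv02 representation before classification) can be sketched as below. This is a minimal illustration, not the paper's exact architecture: the embedding dimensions, the concatenation strategy, and the single logistic head are all assumptions, and random vectors stand in for the real encoder outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the two encoders' outputs (the paper uses
# AraBERTv02 and a Sentence-BERT model; the dimensions here are assumed).
arabert_cls = rng.standard_normal(768)  # [CLS] vector from fine-tuned AraBERTv02
sbert_emb = rng.standard_normal(512)    # contrastively trained Sentence-BERT embedding

# Combine the two views of the tweet by simple concatenation.
combined = np.concatenate([arabert_cls, sbert_emb])  # shape (1280,)

# A single logistic classification head over the combined representation
# (weights are random here; in practice they would be learned).
W = rng.standard_normal(combined.shape[0]) * 0.01
b = 0.0
p_sarcastic = 1.0 / (1.0 + np.exp(-(combined @ W + b)))
```

The intuition is that the fine-tuned encoder captures task-specific surface cues while the sentence embedding supplies a general semantic view, and a small head learns to weigh both.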
Anthology ID:
2021.wanlp-1.40
Volume:
Proceedings of the Sixth Arabic Natural Language Processing Workshop
Month:
April
Year:
2021
Address:
Kyiv, Ukraine (Virtual)
Editors:
Nizar Habash, Houda Bouamor, Hazem Hajj, Walid Magdy, Wajdi Zaghouani, Fethi Bougares, Nadi Tomeh, Ibrahim Abu Farha, Samia Touileb
Venue:
WANLP
Publisher:
Association for Computational Linguistics
Pages:
323–328
URL:
https://aclanthology.org/2021.wanlp-1.40
Cite (ACL):
Laila Bashmal and Daliyah AlZeer. 2021. ArSarcasm Shared Task: An Ensemble BERT Model for Sarcasm Detection in Arabic Tweets. In Proceedings of the Sixth Arabic Natural Language Processing Workshop, pages 323–328, Kyiv, Ukraine (Virtual). Association for Computational Linguistics.
Cite (Informal):
ArSarcasm Shared Task: An Ensemble BERT Model for Sarcasm Detection in Arabic Tweets (Bashmal & AlZeer, WANLP 2021)
PDF:
https://aclanthology.org/2021.wanlp-1.40.pdf