A Benchmark Study of Contrastive Learning for Arabic Social Meaning

Md Tawkat Islam Khondaker, El Moatez Billah Nagoudi, AbdelRahim Elmadany, Muhammad Abdul-Mageed, Laks Lakshmanan, V.S.


Abstract
Contrastive learning (CL) has brought significant progress to various NLP tasks. Despite this progress, CL has not been applied to Arabic NLP, nor is it clear how much benefit it could bring to particular classes of tasks such as social meaning (e.g., sentiment analysis, dialect identification, hate speech detection). In this work, we present a comprehensive benchmark study of state-of-the-art supervised CL methods on a wide array of Arabic social meaning tasks. Through an extensive empirical analysis, we show that CL methods outperform vanilla finetuning on most of the tasks. We also show that CL can be data-efficient and quantify this efficiency, demonstrating the promise of these methods in low-resource settings vis-à-vis the particular downstream tasks (especially label granularity).
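For readers unfamiliar with the objective being benchmarked: supervised CL is typically instantiated as a supervised contrastive (SupCon-style) loss in the spirit of Khosla et al. (2020), which pulls together same-label examples within a batch and pushes apart different-label ones. The sketch below is a minimal PyTorch illustration of that objective, not the authors' implementation; the function name `supervised_contrastive_loss` and the `temperature` default are assumptions made for the example.

```python
# Minimal SupCon-style supervised contrastive loss (illustrative sketch,
# assuming Khosla et al., 2020 formulation; not the paper's exact code).
import torch
import torch.nn.functional as F

def supervised_contrastive_loss(features: torch.Tensor,
                                labels: torch.Tensor,
                                temperature: float = 0.1) -> torch.Tensor:
    """features: (batch, dim) encoder outputs; labels: (batch,) class ids."""
    # L2-normalize so dot products are cosine similarities.
    z = F.normalize(features, dim=1)
    sim = z @ z.T / temperature                       # (batch, batch) logits

    batch = z.size(0)
    self_mask = torch.eye(batch, dtype=torch.bool, device=z.device)
    # Positives: examples sharing the anchor's label, excluding the anchor.
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask

    # Exclude self-similarity, then take a numerically stable log-softmax
    # over all remaining pairs for each anchor.
    sim = sim.masked_fill(self_mask, float("-inf"))
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)

    # Average log-likelihood of positives per anchor; anchors with no
    # in-batch positive are skipped.
    pos_counts = pos_mask.sum(dim=1)
    has_pos = pos_counts > 0
    if not has_pos.any():
        return features.new_zeros(())  # degenerate batch: no positive pairs
    sum_log_prob_pos = torch.where(
        pos_mask, log_prob, torch.zeros_like(log_prob)
    ).sum(dim=1)
    mean_log_prob_pos = sum_log_prob_pos[has_pos] / pos_counts[has_pos]
    return -mean_log_prob_pos.mean()
```

In practice such a loss is often combined with a standard cross-entropy term during finetuning, with `temperature` controlling how sharply same-label pairs are concentrated in the representation space.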
Anthology ID:
2022.wanlp-1.7
Volume:
Proceedings of the Seventh Arabic Natural Language Processing Workshop (WANLP)
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates (Hybrid)
Editors:
Houda Bouamor, Hend Al-Khalifa, Kareem Darwish, Owen Rambow, Fethi Bougares, Ahmed Abdelali, Nadi Tomeh, Salam Khalifa, Wajdi Zaghouani
Venue:
WANLP
SIG:
SIGARAB
Publisher:
Association for Computational Linguistics
Pages:
63–75
URL:
https://aclanthology.org/2022.wanlp-1.7
DOI:
10.18653/v1/2022.wanlp-1.7
Cite (ACL):
Md Tawkat Islam Khondaker, El Moatez Billah Nagoudi, AbdelRahim Elmadany, Muhammad Abdul-Mageed, and Laks Lakshmanan, V.S.. 2022. A Benchmark Study of Contrastive Learning for Arabic Social Meaning. In Proceedings of the Seventh Arabic Natural Language Processing Workshop (WANLP), pages 63–75, Abu Dhabi, United Arab Emirates (Hybrid). Association for Computational Linguistics.
Cite (Informal):
A Benchmark Study of Contrastive Learning for Arabic Social Meaning (Khondaker et al., WANLP 2022)
PDF:
https://aclanthology.org/2022.wanlp-1.7.pdf