AraLegal-BERT: A pretrained language model for Arabic Legal text

Muhammad Al-qurishi, Sarah Alqaseemi, Riad Souissi
Abstract
The effectiveness of the bidirectional encoder representations from transformers (BERT) model for multiple linguistic tasks is well documented. However, its potential for a narrow, specialized domain such as law has not been fully explored. In this study, we examine the use of BERT in the Arabic legal domain: we train BERT from scratch on domain-relevant corpora and customize the resulting language model for several downstream tasks, using matching domain-specific training and test datasets. We introduce AraLegal-BERT, a bidirectional encoder transformer-based model that has been thoroughly tested and carefully optimized with the goal of amplifying the impact of natural language processing-driven solutions on jurisprudence, legal documents, and legal practice. We fine-tuned AraLegal-BERT and evaluated it against three Arabic BERT variants on three natural language understanding tasks. The results show that the base version of AraLegal-BERT achieves better accuracy on legal texts than the general-purpose Arabic BERT models.
Anthology ID:
2022.nllp-1.31
Volume:
Proceedings of the Natural Legal Language Processing Workshop 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates (Hybrid)
Editors:
Nikolaos Aletras, Ilias Chalkidis, Leslie Barrett, Cătălina Goanță, Daniel Preoțiuc-Pietro
Venue:
NLLP
Publisher:
Association for Computational Linguistics
Pages:
338–344
URL:
https://aclanthology.org/2022.nllp-1.31
DOI:
10.18653/v1/2022.nllp-1.31
Cite (ACL):
Muhammad Al-qurishi, Sarah Alqaseemi, and Riad Souissi. 2022. AraLegal-BERT: A pretrained language model for Arabic Legal text. In Proceedings of the Natural Legal Language Processing Workshop 2022, pages 338–344, Abu Dhabi, United Arab Emirates (Hybrid). Association for Computational Linguistics.
Cite (Informal):
AraLegal-BERT: A pretrained language model for Arabic Legal text (Al-qurishi et al., NLLP 2022)
PDF:
https://aclanthology.org/2022.nllp-1.31.pdf