Bilingual BSARD: Extending Statutory Article Retrieval to Dutch

Ehsan Lotfi, Nikolay Banar, Nerses Yuzbashyan, Walter Daelemans


Abstract
Statutory article retrieval plays a crucial role in making legal information more accessible to both laypeople and legal professionals. Multilingual countries like Belgium present unique challenges for retrieval models due to the need to handle legal issues in multiple languages. Building on the Belgian Statutory Article Retrieval Dataset (BSARD) in French, we introduce bBSARD, a bilingual version of this dataset. It contains parallel Belgian statutory articles in French and Dutch, along with the legal questions from BSARD and their Dutch translations. Using bBSARD, we conduct extensive benchmarking of retrieval models available for Dutch and French. Our benchmarking setup includes lexical models, zero-shot dense models, and fine-tuned small foundation models. Our experiments show that BM25 remains a competitive baseline compared to many zero-shot dense models in both languages. We also observe that while proprietary models outperform open alternatives in the zero-shot setting, they can be matched or surpassed by fine-tuning small language-specific models. Our dataset and evaluation code are publicly available.
Anthology ID:
2025.regnlp-1.3
Volume:
Proceedings of the 1st Regulatory NLP Workshop (RegNLP 2025)
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Tuba Gokhan, Kexin Wang, Iryna Gurevych, Ted Briscoe
Venues:
RegNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
10–21
URL:
https://aclanthology.org/2025.regnlp-1.3/
Cite (ACL):
Ehsan Lotfi, Nikolay Banar, Nerses Yuzbashyan, and Walter Daelemans. 2025. Bilingual BSARD: Extending Statutory Article Retrieval to Dutch. In Proceedings of the 1st Regulatory NLP Workshop (RegNLP 2025), pages 10–21, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Bilingual BSARD: Extending Statutory Article Retrieval to Dutch (Lotfi et al., RegNLP 2025)
PDF:
https://aclanthology.org/2025.regnlp-1.3.pdf