jurBERT: A Romanian BERT Model for Legal Judgement Prediction
Mihai Masala | Radu Cristian Alexandru Iacob | Ana Sabina Uban | Marina Cidota | Horia Velicu | Traian Rebedea | Marius Popescu
Proceedings of the Natural Legal Language Processing Workshop 2021
Transformer-based models have become the de facto standard in the field of Natural Language Processing (NLP). By leveraging large unlabeled text corpora, they enable efficient transfer learning, leading to state-of-the-art results on numerous NLP tasks. Nevertheless, for low-resource languages and highly specialized tasks, transformer models tend to lag behind more classical approaches (e.g. SVM, LSTM) due to the lack of such corpora. In this paper we focus on the legal domain and introduce a Romanian BERT model pre-trained on a large specialized corpus. Our model outperforms several strong baselines for legal judgement prediction on two corpora of cases from trials involving banks in Romania.
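To make the setup concrete, below is a minimal sketch of how a domain-specific BERT checkpoint could be fine-tuned for binary legal judgement prediction with the Hugging Face Transformers library. The checkpoint name `readerbench/jurBERT-base`, the toy Romanian case snippets, and the training hyperparameters are assumptions chosen for illustration; they are not the authors' released code, data, or exact setup.

```python
# Hedged sketch: fine-tuning a pre-trained Romanian legal BERT for binary
# judgement prediction (e.g. plea accepted vs. rejected). The checkpoint id,
# example texts, and labels below are illustrative assumptions only.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "readerbench/jurBERT-base"  # assumed checkpoint id, for illustration

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

# Toy case descriptions with binary outcomes (1 = claim admitted, 0 = rejected).
texts = [
    "Reclamantul solicita anularea clauzelor contractuale abuzive...",
    "Instanta respinge cererea formulata de reclamant...",
]
labels = torch.tensor([1, 0])

# Tokenize, truncating long case descriptions to BERT's 512-token limit.
enc = tokenizer(texts, padding=True, truncation=True, max_length=512,
                return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for epoch in range(3):
    optimizer.zero_grad()
    out = model(**enc, labels=labels)  # cross-entropy loss over the 2 classes
    out.loss.backward()
    optimizer.step()

# Inference: predicted outcome class per case.
model.eval()
with torch.no_grad():
    preds = model(**enc).logits.argmax(dim=-1)
print(preds.tolist())
```

In practice, the classical baselines mentioned in the abstract (SVM, LSTM) would be trained on the same case texts, and the transformer's advantage hinges on the in-domain pre-training corpus rather than on this fine-tuning loop itself.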