Improving Romanian LLM Pretraining Data using Diversity and Quality Filtering

Vlad-Andrei Negoiță, Mihai Masala, Traian Rebedea


Abstract
Large Language Models (LLMs) have recently exploded in popularity, often matching or outperforming human abilities on many tasks. One of the key factors in training LLMs is the availability and curation of high-quality data. Data quality is especially crucial for under-represented languages, where high-quality corpora are scarce. In this work we study the characteristics and coverage of Romanian pretraining corpora and examine how they differ from English data. By training a lightweight multitask model on carefully LLM-annotated Romanian texts, we are able to analyze and perform multi-level filtering (e.g., by educational value, topic, and format) to generate high-quality pretraining datasets. Our experiments reveal noteworthy trends in the topics present in Romanian and English data, and demonstrate the effectiveness of our filtering through improved LLM pretraining performance across multiple benchmarks.
Anthology ID:
2026.loreslm-1.13
Volume:
Proceedings of the Second Workshop on Language Models for Low-Resource Languages (LoResLM 2026)
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Hansi Hettiarachchi, Tharindu Ranasinghe, Alistair Plum, Paul Rayson, Ruslan Mitkov, Mohamed Gaber, Damith Premasiri, Fiona Anting Tan, Lasitha Uyangodage
Venue:
LoResLM
Publisher:
Association for Computational Linguistics
Pages:
140–148
URL:
https://aclanthology.org/2026.loreslm-1.13/
Cite (ACL):
Vlad-Andrei Negoiță, Mihai Masala, and Traian Rebedea. 2026. Improving Romanian LLM Pretraining Data using Diversity and Quality Filtering. In Proceedings of the Second Workshop on Language Models for Low-Resource Languages (LoResLM 2026), pages 140–148, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
Improving Romanian LLM Pretraining Data using Diversity and Quality Filtering (Negoiță et al., LoResLM 2026)
PDF:
https://aclanthology.org/2026.loreslm-1.13.pdf