From Zero to Hero: Building Serbian NER from Rules to LLMs

Milica Ikonić Nešić, Saša Petalinkar, Ranka Stanković, Ruslan Mitkov


Abstract
Named Entity Recognition (NER) presents specific challenges in Serbian, a morphologically rich language. To address these challenges, a comparative evaluation of distinct model paradigms across diverse text genres was conducted. A rule-based system (SrpNER), a traditional deep learning model (a Convolutional Neural Network, CNN), fine-tuned transformer architectures (Jerteh and Tesla), and Large Language Models (LLMs), specifically ChatGPT 4.0 Nano and 4.1 Mini, were evaluated and compared. For the LLMs, a one-shot prompt engineering approach was employed, with prompt instructions aligned with the entity type definitions used in the manual annotation guidelines. Evaluation was performed on three Serbian datasets representing varied domains: newspaper articles, history textbook excerpts, and a sample of literary texts from the srpELTeC collection. The highest performance was consistently achieved by the fine-tuned transformer models, with F1 scores ranging from 0.78 on newspaper articles to 0.96 on a primary school history textbook sample.
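
The abstract describes a one-shot prompting setup for LLM-based NER and an F1-based evaluation. The Python sketch below illustrates one plausible shape of that setup; it is not the authors' code. The tag set, prompt wording, example sentences, and the gpt-4.1-mini model name are illustrative assumptions.

    # Minimal sketch of one-shot NER prompting for Serbian plus an
    # entity-level F1 score. All specifics here are assumed, not taken
    # from the paper's materials.
    from openai import OpenAI

    # Assumed tag set; the paper's annotation guidelines define the real one.
    ENTITY_TYPES = ["PERS", "LOC", "ORG"]

    # The single worked example that makes the prompt "one-shot".
    ONE_SHOT_EXAMPLE = (
        "Sentence: Nikola Tesla je rođen u Smiljanu.\n"
        'Entities: [{"text": "Nikola Tesla", "type": "PERS"}, '
        '{"text": "Smiljanu", "type": "LOC"}]'
    )

    def build_messages(sentence: str) -> list[dict]:
        """Assemble a one-shot prompt: task definition, one worked
        example, then the target sentence."""
        system = (
            "You are a named entity recognizer for Serbian. "
            f"Mark entities of types {', '.join(ENTITY_TYPES)} "
            "and return them as a JSON list."
        )
        user = f"{ONE_SHOT_EXAMPLE}\n\nSentence: {sentence}\nEntities:"
        return [
            {"role": "system", "content": system},
            {"role": "user", "content": user},
        ]

    def span_f1(gold: set[tuple], pred: set[tuple]) -> float:
        """Entity-level F1 over (text, type) pairs, scored by exact match."""
        tp = len(gold & pred)  # true positives: predicted spans that match gold
        if tp == 0:
            return 0.0
        precision = tp / len(pred)
        recall = tp / len(gold)
        return 2 * precision * recall / (precision + recall)

    if __name__ == "__main__":
        client = OpenAI()  # expects OPENAI_API_KEY in the environment
        resp = client.chat.completions.create(
            model="gpt-4.1-mini",  # assumed API name for "4.1 Mini"
            messages=build_messages("Milica je otputovala u Beograd."),
        )
        print(resp.choices[0].message.content)

The parsed model output would then be compared against gold annotations with span_f1, which is the exact-match, entity-level scoring convention the reported 0.78 to 0.96 F1 range suggests.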
Anthology ID:
2025.r2lm-1.10
Volume:
Proceedings of the First Workshop on Comparative Performance Evaluation: From Rules to Language Models
Month:
September
Year:
2025
Address:
Varna, Bulgaria
Editors:
Alicia Picazo-Izquierdo, Ernesto Luis Estevanell-Valladares, Ruslan Mitkov, Rafael Muñoz Guillena, Raúl García Cerdá
Venues:
R2LM | WS
Publisher:
INCOMA Ltd., Shoumen, Bulgaria
Pages:
87–96
URL:
https://aclanthology.org/2025.r2lm-1.10/
Cite (ACL):
Milica Ikonić Nešić, Saša Petalinkar, Ranka Stanković, and Ruslan Mitkov. 2025. From Zero to Hero: Building Serbian NER from Rules to LLMs. In Proceedings of the First Workshop on Comparative Performance Evaluation: From Rules to Language Models, pages 87–96, Varna, Bulgaria. INCOMA Ltd., Shoumen, Bulgaria.
Cite (Informal):
From Zero to Hero: Building Serbian NER from Rules to LLMs (Ikonić Nešić et al., R2LM 2025)
PDF:
https://aclanthology.org/2025.r2lm-1.10.pdf