Mitigating Out-of-Entity Errors in Named Entity Recognition: A Sentence-Level Strategy

Guochao Jiang, Ziqin Luo, Chengwei Hu, Zepeng Ding, Deqing Yang


Abstract
Many previous named entity recognition (NER) models suffer from the Out-of-Entity (OOE) problem, in which tokens within the entity mentions of test samples never appear in the training samples, hindering satisfactory performance. To improve OOE-NER performance, we propose in this paper a new framework, S+NER, which fully leverages sentence-level information. S+NER achieves better OOE-NER performance mainly through two designs. 1) It first exploits a pre-trained language model's capability to understand the target entity's sentence-level context with a template set. 2) It then refines the sentence-level representation based on the positive and negative templates, through a contrastive learning strategy and a template pooling method, to obtain better NER results. Our extensive experiments on five benchmark datasets demonstrate that S+NER outperforms state-of-the-art OOE-NER models.
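The contrastive refinement the abstract describes can be sketched in miniature. This is purely illustrative and not the paper's exact formulation: "template pooling" is assumed here to be simple mean pooling over token vectors, and the contrastive objective is assumed to be an InfoNCE-style loss that pulls a sentence representation toward its positive template and away from negative ones.

```python
import math

def mean_pool(token_vecs):
    # Illustrative "template pooling": average the token vectors of a
    # template into one fixed-size representation (an assumption; the
    # paper's actual pooling method may differ).
    dim = len(token_vecs[0])
    return [sum(v[i] for v in token_vecs) / len(token_vecs) for i in range(dim)]

def cosine(u, v):
    # Cosine similarity between two vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def contrastive_loss(anchor, positive, negatives, tau=0.1):
    # InfoNCE-style contrastive loss: maximize similarity between the
    # anchor (sentence-level representation) and the positive template
    # while minimizing similarity to the negative templates.
    pos = math.exp(cosine(anchor, positive) / tau)
    negs = sum(math.exp(cosine(anchor, n) / tau) for n in negatives)
    return -math.log(pos / (pos + negs))
```

An anchor aligned with its positive template yields a small loss; one aligned with a negative template yields a large loss, which is the signal used to refine the sentence-level representation during training.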
Anthology ID:
2025.coling-main.519
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
7754–7765
URL:
https://aclanthology.org/2025.coling-main.519/
Cite (ACL):
Guochao Jiang, Ziqin Luo, Chengwei Hu, Zepeng Ding, and Deqing Yang. 2025. Mitigating Out-of-Entity Errors in Named Entity Recognition: A Sentence-Level Strategy. In Proceedings of the 31st International Conference on Computational Linguistics, pages 7754–7765, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Mitigating Out-of-Entity Errors in Named Entity Recognition: A Sentence-Level Strategy (Jiang et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.519.pdf