Rethinking Negative Instances for Generative Named Entity Recognition

Yuyang Ding, Juntao Li, Pinzheng Wang, Zecheng Tang, Yan Bowen, Min Zhang


Abstract
Large Language Models (LLMs) have demonstrated impressive capabilities in generalizing to unseen tasks. In the Named Entity Recognition (NER) task, recent work has achieved remarkable improvements with LLMs across a broad range of entity domains via instruction tuning with an entity-centric schema. In this work, we explore how the existing methods can be enhanced by incorporating negative instances into training. Our experiments reveal that negative instances contribute to remarkable improvements by (1) introducing contextual information, and (2) clearly delineating label boundaries. Furthermore, we introduce an efficient longest common subsequence (LCS) matching algorithm, tailored to transform unstructured predictions into structured entities. By integrating these components, we present GNER, a Generative NER system with improved zero-shot performance across unseen entity domains. Our comprehensive evaluation illustrates the system's superiority, surpassing state-of-the-art (SoTA) methods by 9 F1 points in zero-shot evaluation.
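The LCS matching step mentioned above aligns a model's free-form generation back to the original input so that predicted labels can be attached to source tokens. The paper's actual algorithm and its efficiency optimizations are not detailed on this page; the following is a minimal illustrative sketch using the standard dynamic-programming LCS with backtracking, where the function name and data are hypothetical.

```python
# Illustrative sketch: align source tokens with a model's (possibly noisy)
# generated tokens via longest common subsequence (LCS). This is a generic
# textbook LCS, not necessarily the paper's optimized variant.

def lcs_align(source, prediction):
    """Return index pairs (i, j) where source[i] == prediction[j]
    along one longest common subsequence of the two token lists."""
    m, n = len(source), len(prediction)
    # dp[i][j] = LCS length of source[:i] and prediction[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if source[i - 1] == prediction[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    # Backtrack to recover one matching of aligned positions.
    pairs = []
    i, j = m, n
    while i > 0 and j > 0:
        if source[i - 1] == prediction[j - 1]:
            pairs.append((i - 1, j - 1))
            i, j = i - 1, j - 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return pairs[::-1]


# Hypothetical example: the generation garbles one token ("Neww"),
# but the remaining tokens still align, so labels emitted alongside
# the aligned predicted tokens can be mapped back to the source.
source = ["John", "lives", "in", "New", "York"]
pred_words = ["John", "lives", "in", "Neww", "York"]
print(lcs_align(source, pred_words))  # → [(0, 0), (1, 1), (2, 2), (4, 4)]
```

Once source positions are aligned with generated positions, any label the model emitted next to a generated token can be transferred to the corresponding source token, turning the unstructured output into structured entity annotations.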
Anthology ID:
2024.findings-acl.206
Volume:
Findings of the Association for Computational Linguistics ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand and virtual meeting
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3461–3475
URL:
https://aclanthology.org/2024.findings-acl.206
Cite (ACL):
Yuyang Ding, Juntao Li, Pinzheng Wang, Zecheng Tang, Yan Bowen, and Min Zhang. 2024. Rethinking Negative Instances for Generative Named Entity Recognition. In Findings of the Association for Computational Linguistics ACL 2024, pages 3461–3475, Bangkok, Thailand and virtual meeting. Association for Computational Linguistics.
Cite (Informal):
Rethinking Negative Instances for Generative Named Entity Recognition (Ding et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.206.pdf