GLiNER: Generalist Model for Named Entity Recognition using Bidirectional Transformer

Urchade Zaratiana, Nadi Tomeh, Pierre Holat, Thierry Charnois


Abstract
Named Entity Recognition (NER) is essential in various Natural Language Processing (NLP) applications. Traditional NER models are effective but limited to a set of predefined entity types. In contrast, Large Language Models (LLMs) can extract arbitrary entities through natural language instructions, offering greater flexibility. However, their size and cost, particularly for those accessed via APIs like ChatGPT, make them impractical in resource-limited scenarios. In this paper, we introduce a compact NER model trained to identify any type of entity. Leveraging a bidirectional transformer encoder, our model, GLiNER, facilitates parallel entity extraction, an advantage over the slow sequential token generation of LLMs. Through comprehensive testing, GLiNER demonstrates strong performance, outperforming both ChatGPT and fine-tuned LLMs in zero-shot evaluations on various NER benchmarks.
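The abstract's contrast with sequential LLM decoding rests on scoring candidate spans against entity-type embeddings in parallel. The following is a minimal PyTorch sketch of that idea, not the paper's released code: the class name, dimensions, and the exact span/label projections are illustrative assumptions.

import torch
import torch.nn as nn

class SpanLabelMatcher(nn.Module):
    # Hypothetical sketch of GLiNER-style span/type matching;
    # all names and sizes here are illustrative, not from the paper.
    def __init__(self, hidden: int = 256):
        super().__init__()
        # A span is represented by its start and end token states concatenated.
        self.span_proj = nn.Linear(2 * hidden, hidden)
        # Entity-type prompts are projected into the same space as spans.
        self.label_proj = nn.Linear(hidden, hidden)

    def forward(self, token_states, span_idx, label_states):
        # token_states: (seq_len, hidden) from a bidirectional encoder
        # span_idx:     (num_spans, 2) start/end token indices of candidate spans
        # label_states: (num_labels, hidden) embeddings of the entity-type names
        starts = token_states[span_idx[:, 0]]
        ends = token_states[span_idx[:, 1]]
        spans = self.span_proj(torch.cat([starts, ends], dim=-1))
        labels = self.label_proj(label_states)
        # One matrix product scores all (span, type) pairs simultaneously,
        # instead of generating entities token by token as an LLM would.
        return spans @ labels.t()  # (num_spans, num_labels)

# Toy usage with random tensors standing in for encoder outputs.
matcher = SpanLabelMatcher(hidden=256)
token_states = torch.randn(12, 256)                     # a 12-token sentence
span_idx = torch.tensor([[0, 1], [3, 5], [7, 7]])       # three candidate spans
label_states = torch.randn(4, 256)                      # four entity types
scores = matcher(token_states, span_idx, label_states)  # shape: (3, 4)

The single matrix product at the end is what makes extraction parallel: every candidate span is scored against every requested entity type in one batched operation, with no autoregressive decoding loop.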
Anthology ID:
2024.naacl-long.300
Volume:
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
5364–5376
URL:
https://aclanthology.org/2024.naacl-long.300
Cite (ACL):
Urchade Zaratiana, Nadi Tomeh, Pierre Holat, and Thierry Charnois. 2024. GLiNER: Generalist Model for Named Entity Recognition using Bidirectional Transformer. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 5364–5376, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
GLiNER: Generalist Model for Named Entity Recognition using Bidirectional Transformer (Zaratiana et al., NAACL 2024)
PDF:
https://aclanthology.org/2024.naacl-long.300.pdf
Copyright:
2024.naacl-long.300.copyright.pdf