Retrieval Augmented Instruction Tuning for Open NER with Large Language Models

Tingyu Xie, Jian Zhang, Yan Zhang, Yuanyuan Liang, Qi Li, Hongwei Wang


Abstract
The strong capability of large language models (LLMs) has been applied to information extraction (IE) through either retrieval augmented prompting or instruction tuning (IT). However, how best to incorporate such information into LLMs for IE remains an open question. In this paper, we explore Retrieval Augmented Instruction Tuning (RA-IT) for IE, focusing on the task of open named entity recognition (NER). Specifically, for each training sample, we retrieve semantically similar examples from the training dataset as the context and prepend them to the input of the original instruction. To evaluate our RA-IT approach more thoroughly, we construct a Chinese IT dataset for open NER and evaluate RA-IT in both English and Chinese scenarios. Experimental results verify the effectiveness of RA-IT across various data sizes in both English and Chinese scenarios. We also conduct thorough studies to explore the impacts of various retrieval strategies in the proposed RA-IT framework.
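
The core idea described in the abstract—retrieving semantically similar training examples and prepending them to each instruction before fine-tuning—can be illustrated with a minimal sketch. The snippet below is not the authors' released code: the embedding model, prompt wording, field names, and helper functions are illustrative assumptions.

```python
# Minimal sketch of RA-IT data construction, assuming a sentence-embedding
# retriever; the model choice and prompt template are illustrative, not the paper's.
import numpy as np
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed retriever

def build_ra_it_samples(train_samples, k=2):
    """For each training sample, retrieve the k most similar other samples
    from the training set and prepend them as in-context examples."""
    texts = [s["text"] for s in train_samples]
    emb = encoder.encode(texts, normalize_embeddings=True)
    sims = emb @ emb.T            # cosine similarity (embeddings are normalized)
    np.fill_diagonal(sims, -1.0)  # exclude the sample itself from its own context

    augmented = []
    for i, sample in enumerate(train_samples):
        top_k = np.argsort(-sims[i])[:k]
        context = "\n".join(
            f"Text: {train_samples[j]['text']}\nEntities: {train_samples[j]['entities']}"
            for j in top_k
        )
        instruction = (
            "Here are some similar examples:\n"
            f"{context}\n\n"
            "Extract all named entities from the following text.\n"
            f"Text: {sample['text']}"
        )
        augmented.append({"instruction": instruction, "output": sample["entities"]})
    return augmented
```

The resulting instruction–output pairs would then be used for standard supervised fine-tuning of the LLM, with the retrieved examples serving as context during training.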
Anthology ID:
2025.coling-main.196
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
2904–2918
URL:
https://aclanthology.org/2025.coling-main.196/
Cite (ACL):
Tingyu Xie, Jian Zhang, Yan Zhang, Yuanyuan Liang, Qi Li, and Hongwei Wang. 2025. Retrieval Augmented Instruction Tuning for Open NER with Large Language Models. In Proceedings of the 31st International Conference on Computational Linguistics, pages 2904–2918, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Retrieval Augmented Instruction Tuning for Open NER with Large Language Models (Xie et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.196.pdf