LlmLink: Dual LLMs for Dynamic Entity Linking on Long Narratives with Collaborative Memorisation and Prompt Optimisation

Lixing Zhu, Jun Wang, Yulan He


Abstract
We address the task of coreference resolution (coref) in chunked long narratives. Existing approaches remain either focused on supervised fine-tuning or limited to one-off prediction, both of which struggle when the context is long. We develop a dynamic approach to cope with this: we deploy dual Large Language Models (LLMs), assigning one specialised LLM to local named entity recognition and the other to distant coref, while ensuring they exchange information. Using our novel memorisation schemes, the coreference LLM memorises characters and their associated descriptions, thereby reducing token consumption compared with storing previous messages. To alleviate LLM hallucinations, we employ an automatic prompt optimisation method, with the LLM ranker modified to leverage annotations. Our approach achieves performance gains over other LLM-based models and fine-tuning approaches on long-narrative datasets while significantly reducing the resources required for inference and training.
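To make the described pipeline concrete, below is a minimal sketch of a dual-LLM loop with a compact entity memory, assuming a generic chat-completion backend. All names here (call_llm, EntityMemory, the prompt strings, the memory-update rule) are illustrative assumptions for exposition, not the authors' released code or exact prompts.

```python
# Sketch: dual-LLM entity linking over chunked narrative text.
# One LLM does local NER on the current chunk; a second LLM does
# distant coref against a compact entity memory instead of the
# full message history, which is where the token savings come from.
from dataclasses import dataclass, field

@dataclass
class EntityMemory:
    """Stores characters and short descriptions rather than past messages."""
    entities: dict[str, str] = field(default_factory=dict)  # name -> description

    def render(self) -> str:
        # Compact memory string fed to the coref LLM at each step.
        return "\n".join(f"- {n}: {d}" for n, d in self.entities.items())

    def update(self, name: str, description: str) -> None:
        self.entities[name] = description

def call_llm(system: str, user: str) -> str:
    """Placeholder for any chat-completion backend (OpenAI-style, local, etc.)."""
    raise NotImplementedError

def link_entities(chunks: list[str]) -> list[dict]:
    memory = EntityMemory()
    results = []
    for chunk in chunks:
        # LLM 1: local named entity recognition on the current chunk only.
        mentions = call_llm(
            system="Extract person mentions from the passage, one per line.",
            user=chunk,
        ).splitlines()
        # LLM 2: distant coreference, conditioned on the rendered memory
        # rather than on all previous messages.
        links = call_llm(
            system=("Known characters:\n" + memory.render() +
                    "\nLink each mention to a known character or mark it NEW, "
                    "and give a one-line description."),
            user="\n".join(mentions),
        )
        results.append({"chunk": chunk, "links": links})
        # A real implementation would parse `links` and write the coref LLM's
        # descriptions back into memory; this toy rule just records mentions.
        for mention in mentions:
            memory.update(mention, "seen in latest chunk")
    return results
```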
Anthology ID:
2025.coling-main.751
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
11334–11347
URL:
https://aclanthology.org/2025.coling-main.751/
Cite (ACL):
Lixing Zhu, Jun Wang, and Yulan He. 2025. LlmLink: Dual LLMs for Dynamic Entity Linking on Long Narratives with Collaborative Memorisation and Prompt Optimisation. In Proceedings of the 31st International Conference on Computational Linguistics, pages 11334–11347, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
LlmLink: Dual LLMs for Dynamic Entity Linking on Long Narratives with Collaborative Memorisation and Prompt Optimisation (Zhu et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.751.pdf