Contextual Augmentation for Entity Linking using Large Language Models

Daniel Vollmers, Hamada Zahera, Diego Moussallem, Axel-Cyrille Ngonga Ngomo


Abstract
Entity linking involves detecting entity mentions in natural-language text and linking them to a knowledge graph. Traditional methods use a two-step pipeline with separate models for entity recognition and disambiguation, which can be computationally intensive and less effective. We propose a fine-tuned model that jointly integrates entity recognition and disambiguation in a unified framework. Furthermore, our approach leverages large language models to enrich the context of entity mentions, yielding better disambiguation. We evaluated our approach on benchmark datasets and compared it against several baselines. The evaluation results show that our approach achieves state-of-the-art performance on out-of-domain datasets.
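
To make the abstract's core idea concrete, below is a minimal Python sketch of LLM-based contextual augmentation feeding a joint entity-linking step. Everything here is an assumption for illustration: the function names (augment_context, link_entities), the prompt wording, and the [SEP] concatenation scheme are hypothetical and not taken from the paper.

    def augment_context(mention, sentence, llm_generate):
        """Enrich an entity mention with an LLM-generated description.

        `llm_generate` is a placeholder for any text-completion call
        (e.g., an OpenAI- or Hugging Face-style generation function).
        """
        prompt = (
            f"Sentence: {sentence}\n"
            f"Briefly describe the entity '{mention}' in this sentence:"
        )
        description = llm_generate(prompt)
        # Concatenate the generated description with the original sentence so
        # the downstream model sees richer disambiguation context.
        return f"{sentence} [SEP] {mention}: {description}"

    def link_entities(augmented_text, el_model):
        """Placeholder for the joint recognition + disambiguation model,
        which maps mention spans in the augmented text to knowledge-graph
        entities."""
        return el_model(augmented_text)

The point of the sketch is that disambiguation operates on the augmented text rather than the raw sentence, which is what the abstract credits for the improved out-of-domain performance.
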
Anthology ID:
2025.coling-main.570
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
8535–8545
URL:
https://aclanthology.org/2025.coling-main.570/
Cite (ACL):
Daniel Vollmers, Hamada Zahera, Diego Moussallem, and Axel-Cyrille Ngonga Ngomo. 2025. Contextual Augmentation for Entity Linking using Large Language Models. In Proceedings of the 31st International Conference on Computational Linguistics, pages 8535–8545, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Contextual Augmentation for Entity Linking using Large Language Models (Vollmers et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.570.pdf