Unsupervised Named Entity Disambiguation for Low Resource Domains

Debarghya Datta, Soumajit Pramanik


Abstract
In the ever-evolving landscape of natural language processing and information retrieval, the need for robust, domain-specific entity linking algorithms has become increasingly apparent. In fields such as the humanities, technical writing, and the biomedical sciences, entity linking is crucial for enriching texts with semantics and discovering new knowledge. Applying Named Entity Disambiguation (NED) in such domains requires handling noisy texts, low-resource settings, and domain-specific knowledge bases (KBs). Existing approaches are largely unsuitable for these scenarios, as they either depend on training data or are not flexible enough to work with domain-specific KBs. In this work, we therefore present an unsupervised approach leveraging the concept of Group Steiner Trees (GST), which identifies the most relevant candidate for entity disambiguation using the contextual similarities across the candidate entities of all mentions present in a document. We outperform the state-of-the-art unsupervised methods by more than 40% on average in terms of Precision@1 and Hit@5 across various domain-specific datasets.
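To make the idea concrete, the following is a minimal, illustrative sketch of GST-style collective disambiguation in Python. Everything in it is an assumption for illustration: the function names, the use of cosine similarity over context embeddings as the edge weight, and the simple star-shaped heuristic (trying each candidate as a root and attaching the most similar candidate from every other mention's group) stand in for the paper's actual GST formulation, which the abstract does not spell out.

# Minimal sketch of GST-style collective entity disambiguation.
# Hypothetical names and a star-shaped Group Steiner Tree heuristic;
# NOT the authors' actual algorithm.
import numpy as np

def cosine(u, v):
    # Cosine similarity between two context embeddings.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9))

def disambiguate(candidate_groups):
    """Pick one candidate entity per mention.

    candidate_groups: one list per mention, each holding
    (entity_id, context_embedding) candidate pairs.
    A Group Steiner Tree must touch every group; here we approximate it
    with the cheapest "star": each candidate is tried as the root and
    connected to the most similar candidate in every other group.
    """
    best_cost, best_pick = float("inf"), None
    for gi, group in enumerate(candidate_groups):
        for root_id, root_emb in group:
            pick, cost = {gi: root_id}, 0.0
            for gj, other in enumerate(candidate_groups):
                if gj == gi:
                    continue
                # Edge weight = 1 - similarity, so coherent pairs are cheap.
                eid, w = min(((e, 1.0 - cosine(root_emb, emb))
                              for e, emb in other), key=lambda t: t[1])
                pick[gj] = eid
                cost += w
            if cost < best_cost:
                best_cost, best_pick = cost, pick
    return [best_pick[i] for i in range(len(candidate_groups))]

The star heuristic only keeps the sketch short: exact GST is NP-hard, so a faithful implementation would use a proper approximation over arbitrary trees and would derive edge weights from whatever contextual representation the domain-specific KB provides.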
Anthology ID: 2024.emnlp-main.830
Volume: Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month: November
Year: 2024
Address: Miami, Florida, USA
Editors: Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 14922–14928
URL: https://aclanthology.org/2024.emnlp-main.830
Cite (ACL): Debarghya Datta and Soumajit Pramanik. 2024. Unsupervised Named Entity Disambiguation for Low Resource Domains. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 14922–14928, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal): Unsupervised Named Entity Disambiguation for Low Resource Domains (Datta & Pramanik, EMNLP 2024)
PDF: https://aclanthology.org/2024.emnlp-main.830.pdf