Low-Rank Subspaces for Unsupervised Entity Linking

Akhil Arora, Alberto Garcia-Duran, Robert West


Abstract
Entity linking is an important problem with many applications. Most previous solutions were designed for settings where annotated training data is available, which is, however, not the case in numerous domains. We propose a light-weight and scalable entity linking method, Eigenthemes, that relies solely on the availability of entity names and a referent knowledge base. Eigenthemes exploits the fact that the entities that are truly mentioned in a document (the “gold entities”) tend to form a semantically dense subset of the set of all candidate entities in the document. Geometrically speaking, when representing entities as vectors via some given embedding, the gold entities tend to lie in a low-rank subspace of the full embedding space. Eigenthemes identifies this subspace using the singular value decomposition and scores candidate entities according to their proximity to the subspace. On the empirical front, we introduce multiple strong baselines that compare favorably to (and sometimes even outperform) the existing state of the art. Extensive experiments on benchmark datasets from a variety of real-world domains showcase the effectiveness of our approach.
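To make the geometric intuition concrete, below is a minimal NumPy sketch of scoring candidate entities by their proximity to a low-rank subspace obtained via SVD. This is not the authors' implementation (see the linked epfl-dlab/eigenthemes repository); the rank parameter k and the cosine-style proximity score are illustrative assumptions.

import numpy as np

def eigentheme_scores(candidate_embeddings: np.ndarray, k: int = 3) -> np.ndarray:
    """Score candidates by proximity to the rank-k subspace spanned by the
    top-k left singular vectors of the candidate embedding matrix.

    candidate_embeddings: (n_candidates, dim) matrix of entity vectors.
    Returns n_candidates scores in [0, 1]; higher means closer to the
    dominant ("eigentheme") subspace.
    """
    # Column-stack candidates so left singular vectors live in embedding space.
    X = candidate_embeddings.T                       # (dim, n_candidates)
    U, _, _ = np.linalg.svd(X, full_matrices=False)  # U: (dim, min(dim, n))
    basis = U[:, :k]                                 # orthonormal subspace basis

    # Proximity = norm of the projection onto the subspace divided by the
    # vector's own norm (cosine of the angle between vector and subspace).
    proj = candidate_embeddings @ basis              # (n_candidates, k)
    proj_norm = np.linalg.norm(proj, axis=1)
    full_norm = np.linalg.norm(candidate_embeddings, axis=1) + 1e-12
    return proj_norm / full_norm

# Example usage with random stand-in embeddings (illustrative only):
# scores = eigentheme_scores(np.random.randn(20, 300), k=3)
# ranked = np.argsort(-scores)

Because gold entities tend to be semantically coherent, they dominate the top singular directions, so candidates near that subspace receive higher scores; spurious candidates, which point in unrelated directions, score lower.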
Anthology ID:
2021.emnlp-main.634
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
8037–8054
URL:
https://aclanthology.org/2021.emnlp-main.634
DOI:
10.18653/v1/2021.emnlp-main.634
Cite (ACL):
Akhil Arora, Alberto Garcia-Duran, and Robert West. 2021. Low-Rank Subspaces for Unsupervised Entity Linking. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 8037–8054, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Low-Rank Subspaces for Unsupervised Entity Linking (Arora et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.634.pdf
Code:
epfl-dlab/eigenthemes