A Sequence Learning Method for Domain-Specific Entity Linking

Emrah Inan, Oguz Dikenelli


Abstract
Recent collective Entity Linking studies typically promote global coherence among all the mapped entities in the same document by using semantic embeddings and graph-based approaches. Although graph-based approaches achieve remarkable results, they are computationally expensive on general datasets. Semantic embeddings, in turn, only indicate relatedness between entity pairs without considering sequence information. In this paper, we address these problems by introducing a two-fold neural model. First, we match easy mention-entity pairs and use the domain information of these pairs to filter the candidate entities of nearby mentions. Second, we resolve the more ambiguous pairs using bidirectional Long Short-Term Memory (LSTM) and CRF models for entity disambiguation. Our proposed system outperforms state-of-the-art systems on the generated domain-specific evaluation dataset.
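The first stage of the described method, using an unambiguous ("easy") mention's domain to prune the candidate sets of nearby mentions, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation; all function names, the one-candidate notion of "easy", and the toy data are assumptions.

```python
def filter_candidates(mentions, candidates, domains):
    """Prune candidate entities using the domain of an easy mention.

    mentions:   list of mention strings in document order
    candidates: dict mapping mention -> list of candidate entity ids
    domains:    dict mapping entity id -> domain label
    Returns a dict mapping mention -> filtered candidate list.
    """
    # Treat a mention with exactly one candidate as "easy"; its
    # entity's domain anchors the document (an illustrative choice).
    anchor_domain = None
    for m in mentions:
        if len(candidates[m]) == 1:
            anchor_domain = domains[candidates[m][0]]
            break
    if anchor_domain is None:
        return dict(candidates)  # nothing to anchor on; keep all candidates

    filtered = {}
    for m in mentions:
        same = [e for e in candidates[m] if domains[e] == anchor_domain]
        filtered[m] = same or candidates[m]  # never empty a candidate set
    return filtered


# Toy example (invented): "JVM" is unambiguous, "Java" is not.
cands = {"JVM": ["Java_virtual_machine"],
         "Java": ["Java_(programming_language)", "Java_(island)"]}
doms = {"Java_virtual_machine": "computing",
        "Java_(programming_language)": "computing",
        "Java_(island)": "geography"}
print(filter_candidates(["JVM", "Java"], cands, doms))
```

In this toy run the easy mention "JVM" fixes the domain to "computing", so the island reading of "Java" is filtered out before the second, sequence-labeling stage would run.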
Anthology ID:
W18-2403
Volume:
Proceedings of the Seventh Named Entities Workshop
Month:
July
Year:
2018
Address:
Melbourne, Australia
Venues:
ACL | NEWS | WS
Publisher:
Association for Computational Linguistics
Pages:
14–21
URL:
https://aclanthology.org/W18-2403
DOI:
10.18653/v1/W18-2403
PDF:
https://aclanthology.org/W18-2403.pdf