Mixing Context Granularities for Improved Entity Linking on Question Answering Data across Entity Categories

Daniil Sorokin, Iryna Gurevych


Abstract
The first stage of every knowledge base question answering approach is to link entities in the input question. We investigate entity linking in the context of the question answering task and present a jointly optimized neural architecture for entity mention detection and entity disambiguation that models the surrounding context on different levels of granularity. We use the Wikidata knowledge base and available question answering datasets to create benchmarks for entity linking on question answering data. Our approach outperforms the previous state-of-the-art system on this data, yielding an average 8% improvement in the final score. We further demonstrate that our model delivers strong performance across different entity categories.
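To make the two-stage setup the abstract describes concrete, here is a minimal sketch of an entity-linking pipeline: mention detection followed by disambiguation against a knowledge base. This is purely illustrative, not the paper's method; the toy dictionary lookup and overlap-based scoring stand in for the jointly trained neural components, and the knowledge base, entity IDs, and scoring rules below are invented for demonstration.

```python
# Illustrative two-stage entity linking: mention detection, then
# disambiguation against a toy Wikidata-style knowledge base.
# All data and scoring heuristics here are invented placeholders
# for the paper's jointly optimized neural models.

# Toy KB: lowercase surface form -> candidate entity IDs.
KB = {
    "barack obama": ["Q76"],
    "obama": ["Q76", "Q41773"],  # the person vs. the city Obama, Japan
    "japan": ["Q17"],
}

def detect_mentions(question):
    """Greedy longest-match mention detection over the toy KB
    (stand-in for the neural mention-detection component)."""
    tokens = question.lower().split()
    mentions, i = [], 0
    while i < len(tokens):
        for j in range(len(tokens), i, -1):
            span = " ".join(tokens[i:j])
            if span in KB:
                mentions.append(span)
                i = j
                break
        else:
            i += 1
    return mentions

def disambiguate(mention, question):
    """Pick a candidate entity using a crude context heuristic
    (stand-in for the neural disambiguation component, which
    would score candidates against multi-granularity context)."""
    context = set(question.lower().split())

    def score(entity_id):
        # Toy rule: prefer the city reading only if "city" appears
        # in the question; otherwise prefer the person entity.
        if entity_id == "Q41773" and "city" in context:
            return 2
        return 1 if entity_id == "Q76" else 0

    return max(KB[mention], key=score)

def link_entities(question):
    """Full pipeline: detect mentions, then link each one."""
    return [(m, disambiguate(m, question)) for m in detect_mentions(question)]

print(link_entities("Where was Barack Obama born"))
# -> [('barack obama', 'Q76')]
```

In the paper itself the two stages are optimized jointly rather than run as independent heuristics, which is what lets context at several granularities inform both mention boundaries and entity choice.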
Anthology ID:
S18-2007
Volume:
Proceedings of the Seventh Joint Conference on Lexical and Computational Semantics
Month:
June
Year:
2018
Address:
New Orleans, Louisiana
Venues:
*SEM | SemEval
SIGs:
SIGSEM | SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
65–75
URL:
https://aclanthology.org/S18-2007
DOI:
10.18653/v1/S18-2007
PDF:
https://aclanthology.org/S18-2007.pdf