%0 Conference Proceedings
%T Improving Entity Disambiguation by Reasoning over a Knowledge Base
%A Ayoola, Tom
%A Fisher, Joseph
%A Pierleoni, Andrea
%Y Carpuat, Marine
%Y de Marneffe, Marie-Catherine
%Y Meza Ruiz, Ivan Vladimir
%S Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
%D 2022
%8 July
%I Association for Computational Linguistics
%C Seattle, United States
%F ayoola-etal-2022-improving
%X Recent work in entity disambiguation (ED) has typically neglected structured knowledge base (KB) facts, and instead relied on a limited subset of KB information, such as entity descriptions or types. This limits the range of contexts in which entities can be disambiguated. To allow the use of all KB facts, as well as descriptions and types, we introduce an ED model which links entities by reasoning over a symbolic knowledge base in a fully differentiable fashion. Our model surpasses state-of-the-art baselines on six well-established ED datasets by 1.3 F1 on average. By allowing access to all KB information, our model is less reliant on popularity-based entity priors, and improves performance on the challenging ShadowLink dataset (which emphasises infrequent and ambiguous entities) by 12.7 F1.
%R 10.18653/v1/2022.naacl-main.210
%U https://aclanthology.org/2022.naacl-main.210
%U https://doi.org/10.18653/v1/2022.naacl-main.210
%P 2899-2912