A Semantics-aware Transformer Model of Relation Linking for Knowledge Base Question Answering

Tahira Naseem, Srinivas Ravishankar, Nandana Mihindukulasooriya, Ibrahim Abdelaziz, Young-Suk Lee, Pavan Kapanipathi, Salim Roukos, Alfio Gliozzo, Alexander Gray


Abstract
Relation linking is a crucial component of Knowledge Base Question Answering systems. Existing systems rely on a wide variety of heuristics, or on ensembles of multiple systems, that depend heavily on the surface text of the question. However, the explicit semantic parse of the question is a rich source of relation information that these systems leave unexploited. We propose a simple transformer-based neural model for relation linking that leverages the AMR semantic parse of a sentence. Our system significantly outperforms the state of the art on four popular benchmark datasets based on either DBpedia or Wikidata, demonstrating that our approach is effective across knowledge graphs.
Anthology ID:
2021.acl-short.34
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
Month:
August
Year:
2021
Address:
Online
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
256–262
URL:
https://aclanthology.org/2021.acl-short.34
DOI:
10.18653/v1/2021.acl-short.34
Cite (ACL):
Tahira Naseem, Srinivas Ravishankar, Nandana Mihindukulasooriya, Ibrahim Abdelaziz, Young-Suk Lee, Pavan Kapanipathi, Salim Roukos, Alfio Gliozzo, and Alexander Gray. 2021. A Semantics-aware Transformer Model of Relation Linking for Knowledge Base Question Answering. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 256–262, Online. Association for Computational Linguistics.
Cite (Informal):
A Semantics-aware Transformer Model of Relation Linking for Knowledge Base Question Answering (Naseem et al., ACL 2021)
PDF:
https://aclanthology.org/2021.acl-short.34.pdf