A Copy Mechanism for Handling Knowledge Base Elements in SPARQL Neural Machine Translation

Rose Hirigoyen, Amal Zouaq, Samuel Reyd


Abstract
Neural Machine Translation (NMT) models from English to SPARQL are a promising development for SPARQL query generation. However, current architectures are unable to integrate the knowledge base (KB) schema and handle questions on knowledge resources, classes, and properties unseen during training, rendering them unusable outside the scope of topics covered in the training set. Inspired by the performance gains in natural language processing tasks, we propose to integrate a copy mechanism for neural SPARQL query generation as a way to tackle this issue. We illustrate our proposal by adding a copy layer and a dynamic knowledge base vocabulary to two Seq2Seq architectures (CNNs and Transformers). This layer makes the models copy KB elements directly from the questions instead of generating them. We evaluate our approach on state-of-the-art datasets, including datasets referencing unknown KB elements, and measure the accuracy of the copy-augmented architectures. Our results show a considerable increase in performance on all datasets compared to non-copy architectures.
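The abstract describes a copy layer that mixes generation over a fixed SPARQL vocabulary with copying of KB elements directly from the input question via a dynamic vocabulary. The snippet below is a minimal, hedged NumPy sketch of that general idea (a pointer/copy step in the style of pointer-generator networks); it is not the authors' implementation, and all names (copy_step, p_gen, extended_size, the toy ids) are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def copy_step(gen_logits, attn_scores, src_ids, extended_size, p_gen):
    """One decoding step of a generate-or-copy mixture.

    gen_logits  : (vocab_size,) scores over the fixed SPARQL target vocabulary
    attn_scores : (src_len,)    attention scores over the source-question tokens
    src_ids     : (src_len,)    ids of source tokens in the extended vocabulary
                                (fixed vocab + dynamic KB-element entries)
    extended_size: size of the extended (fixed + dynamic) vocabulary
    p_gen       : scalar in [0, 1], probability of generating vs. copying
    """
    p_vocab = softmax(gen_logits)          # distribution over the fixed vocabulary
    p_copy = softmax(attn_scores)          # distribution over source positions
    p_final = np.zeros(extended_size)
    p_final[: len(p_vocab)] = p_gen * p_vocab
    # Scatter-add copy mass onto source token ids, including ids that only
    # exist in the dynamic KB vocabulary (i.e. unseen during training).
    np.add.at(p_final, src_ids, (1.0 - p_gen) * p_copy)
    return p_final

# Toy example: fixed vocab of 6 SPARQL tokens; the question contains one
# KB resource mapped to dynamic id 6, outside the fixed vocabulary.
gen_logits = np.array([0.1, 2.0, 0.3, 0.0, -1.0, 0.5])
attn_scores = np.array([0.2, 3.0, 0.1])   # attention over 3 question tokens
src_ids = np.array([1, 6, 3])             # middle token is the KB element
p = copy_step(gen_logits, attn_scores, src_ids, extended_size=7, p_gen=0.4)
print(p.argmax())                          # 6: the unseen KB element is copied
```

Under these assumptions, an unseen KB identifier can receive probability mass only through the copy branch, which is what lets such a model emit resources, classes, and properties absent from its training vocabulary.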
Anthology ID:
2022.findings-aacl.22
Volume:
Findings of the Association for Computational Linguistics: AACL-IJCNLP 2022
Month:
November
Year:
2022
Address:
Online only
Editors:
Yulan He, Heng Ji, Sujian Li, Yang Liu, Chia-Hui Chang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
226–236
URL:
https://aclanthology.org/2022.findings-aacl.22
Cite (ACL):
Rose Hirigoyen, Amal Zouaq, and Samuel Reyd. 2022. A Copy Mechanism for Handling Knowledge Base Elements in SPARQL Neural Machine Translation. In Findings of the Association for Computational Linguistics: AACL-IJCNLP 2022, pages 226–236, Online only. Association for Computational Linguistics.
Cite (Informal):
A Copy Mechanism for Handling Knowledge Base Elements in SPARQL Neural Machine Translation (Hirigoyen et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-aacl.22.pdf