Query2Triple: Unified Query Encoding for Answering Diverse Complex Queries over Knowledge Graphs

Yao Xu, Shizhu He, Cunguang Wang, Li Cai, Kang Liu, Jun Zhao


Abstract
Complex Query Answering (CQA) is a challenging task over Knowledge Graphs (KGs). Because KGs are incomplete, query embedding (QE) methods have been proposed to encode queries and entities into the same embedding space and to treat logical operators as neural set operators in order to obtain answers. However, these methods train KG embeddings and neural set operators concurrently on both simple (one-hop) and complex (multi-hop and logical) queries, which causes performance degradation on simple queries and low training efficiency. In this paper, we propose Query to Triple (Q2T), a novel approach that decouples the training for simple and complex queries. Q2T divides the training into two stages: (1) pre-training a neural link predictor on simple queries to predict tail entities from the head entity and relation; (2) training a query encoder on complex queries to encode diverse complex queries into a unified triple form that can be efficiently solved by the pretrained link predictor. Our proposed Q2T is not only efficient to train but also modular, and thus easily adaptable to various well-studied neural link predictors. Extensive experiments demonstrate that, even without explicit modeling of neural set operators, Q2T still achieves state-of-the-art performance on diverse complex queries over three public benchmarks.
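The abstract describes a two-stage pipeline: pretrain a link predictor on one-hop triples, then freeze it and train only a query encoder that maps each complex query to a pseudo (head, relation) pair the predictor can score. Below is a minimal PyTorch sketch of that idea, based only on the abstract; the module names, the DistMult-style scorer, and the Transformer query encoder are illustrative assumptions, not the authors' actual Q2T implementation.

```python
# Hypothetical sketch of Q2T's two-stage training, based only on the abstract.
# The scorer, encoder architecture, and all names/shapes are assumptions.
import torch
import torch.nn as nn

class LinkPredictor(nn.Module):
    """Stage 1: a neural link predictor (here, a DistMult-style scorer)."""
    def __init__(self, n_entities, n_relations, dim):
        super().__init__()
        self.ent = nn.Embedding(n_entities, dim)
        self.rel = nn.Embedding(n_relations, dim)

    def score(self, head_emb, rel_emb):
        # Score every entity as a candidate tail: (batch, n_entities)
        return (head_emb * rel_emb) @ self.ent.weight.t()

    def forward(self, head_idx, rel_idx):
        return self.score(self.ent(head_idx), self.rel(rel_idx))

class QueryEncoder(nn.Module):
    """Stage 2: encode a complex query into a pseudo (head, relation) pair
    so the frozen link predictor can score it like an ordinary triple."""
    def __init__(self, dim, n_tokens):
        super().__init__()
        self.tok = nn.Embedding(n_tokens, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.to_head = nn.Linear(dim, dim)
        self.to_rel = nn.Linear(dim, dim)

    def forward(self, query_tokens):
        h = self.encoder(self.tok(query_tokens)).mean(dim=1)
        return self.to_head(h), self.to_rel(h)

# Stage 1: pretrain the predictor on one-hop triples (omitted), then freeze it.
predictor = LinkPredictor(n_entities=1000, n_relations=50, dim=64)
for p in predictor.parameters():
    p.requires_grad_(False)

# Stage 2: train only the query encoder against the frozen scorer.
encoder = QueryEncoder(dim=64, n_tokens=200)
queries = torch.randint(0, 200, (8, 5))            # a batch of tokenized complex queries
pseudo_head, pseudo_rel = encoder(queries)
logits = predictor.score(pseudo_head, pseudo_rel)  # (8, 1000) answer scores
loss = nn.functional.cross_entropy(logits, torch.randint(0, 1000, (8,)))
```

Because the predictor is trained and frozen in stage 1, any well-studied link predictor could in principle be plugged in, which is the modularity the abstract highlights.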
Anthology ID: 2023.findings-emnlp.761
Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 11369–11382
URL: https://aclanthology.org/2023.findings-emnlp.761
DOI: 10.18653/v1/2023.findings-emnlp.761
Cite (ACL): Yao Xu, Shizhu He, Cunguang Wang, Li Cai, Kang Liu, and Jun Zhao. 2023. Query2Triple: Unified Query Encoding for Answering Diverse Complex Queries over Knowledge Graphs. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 11369–11382, Singapore. Association for Computational Linguistics.
Cite (Informal): Query2Triple: Unified Query Encoding for Answering Diverse Complex Queries over Knowledge Graphs (Xu et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-emnlp.761.pdf