ATAP: Automatic Template-Augmented Commonsense Knowledge Graph Completion via Pre-Trained Language Models

Fu Zhang, Yifan Ding, Jingwei Cheng


Abstract
Commonsense knowledge graph completion (CKGC) aims to infer missing facts from known commonsense knowledge. CKGC methods can be roughly divided into two categories: triple-based methods and text-based methods. Due to the imbalanced distribution of entities and limited structural information, triple-based methods struggle with long-tail entities. Text-based methods alleviate this issue but require extensive training and fine-tuning of language models, which reduces efficiency. To address these problems, we propose ATAP, the first CKGC framework that combines automatically generated continuous prompt templates with pre-trained language models (PLMs). ATAP further employs a carefully designed prompt-template training strategy that guides PLMs to generate optimal prompt templates for CKGC tasks. By combining the rich knowledge of PLMs with the automatic template-augmentation strategy, ATAP effectively mitigates the long-tail problem and improves CKGC performance. Results on benchmark datasets show that ATAP achieves state-of-the-art performance overall.
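The abstract describes combining learnable (continuous) prompt templates with a PLM to complete commonsense triples. Below is a minimal, hedged sketch of that general idea in Python; the choice of "bert-base-uncased", the prompt length, the scoring head, and the ContinuousPromptScorer class are illustrative assumptions and not the paper's actual implementation.

# Illustrative sketch only: learnable "soft" prompt vectors prepended to a
# frozen PLM's input embeddings, used to score a (head, relation, tail) triple.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class ContinuousPromptScorer(nn.Module):
    def __init__(self, plm_name="bert-base-uncased", prompt_len=5):
        super().__init__()
        self.plm = BertModel.from_pretrained(plm_name)
        hidden = self.plm.config.hidden_size
        # Continuous prompt template: trainable vectors, not discrete tokens.
        self.prompt = nn.Parameter(torch.randn(prompt_len, hidden) * 0.02)
        self.scorer = nn.Linear(hidden, 1)  # plausibility score for the triple
        # Freeze the PLM so only the prompt and scorer are trained (one common setup).
        for p in self.plm.parameters():
            p.requires_grad = False

    def forward(self, input_ids, attention_mask):
        tok_emb = self.plm.embeddings.word_embeddings(input_ids)      # (B, L, H)
        batch = input_ids.size(0)
        prompt = self.prompt.unsqueeze(0).expand(batch, -1, -1)       # (B, P, H)
        inputs_embeds = torch.cat([prompt, tok_emb], dim=1)           # (B, P+L, H)
        prompt_mask = torch.ones(batch, prompt.size(1), device=input_ids.device)
        mask = torch.cat([prompt_mask, attention_mask], dim=1)
        out = self.plm(inputs_embeds=inputs_embeds, attention_mask=mask)
        first = out.last_hidden_state[:, 0]                           # pooled first position
        return self.scorer(first).squeeze(-1)

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = ContinuousPromptScorer()
enc = tokenizer("go to a restaurant [SEP] motivated by [SEP] hunger",
                return_tensors="pt")
score = model(enc["input_ids"], enc["attention_mask"])
print(score)  # unnormalized plausibility of the verbalized triple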
Anthology ID:
2024.emnlp-main.919
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
16456–16472
URL:
https://aclanthology.org/2024.emnlp-main.919
DOI:
10.18653/v1/2024.emnlp-main.919
Cite (ACL):
Fu Zhang, Yifan Ding, and Jingwei Cheng. 2024. ATAP: Automatic Template-Augmented Commonsense Knowledge Graph Completion via Pre-Trained Language Models. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 16456–16472, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
ATAP: Automatic Template-Augmented Commonsense Knowledge Graph Completion via Pre-Trained Language Models (Zhang et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.919.pdf