ACENet: Attention Guided Commonsense Reasoning on Hybrid Knowledge Graph

Chuzhan Hao, Minghui Xie, Peng Zhang


Abstract
Augmenting pre-trained language models (PLMs) with knowledge graphs (KGs) has demonstrated superior performance on commonsense reasoning. Given a commonsense-based QA context (a question and multiple choices), existing approaches usually estimate the plausibility of each candidate choice separately from its own retrieved KG, without considering the interference among different choices. In this paper, we propose an Attention guided Commonsense rEasoning Network (ACENet) to endow the neural network with the capability of integrating hybrid knowledge. Specifically, our model applies multi-layer interaction among answer choices to continually strengthen correct-choice information and guide the message passing of the GNN. In addition, we design a mix attention mechanism over nodes and edges to iteratively select supporting evidence on the hybrid knowledge graph. Experimental results demonstrate the effectiveness of our proposed model through considerable performance gains on the CommonsenseQA and OpenbookQA datasets.
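The attention-guided message passing sketched in the abstract can be illustrated with a toy single-layer example. Note this is a hedged sketch, not the paper's actual formulation: the function names, the additive node-plus-edge scoring, and the plain weighted-average aggregation are all illustrative stand-ins for ACENet's mix attention mechanism over nodes and edges.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention_message_passing(node_feats, edges, node_scores, edge_scores):
    """One hypothetical GNN layer: each node aggregates its in-neighbors'
    features, weighted by a softmax over combined node+edge attention
    scores (a stand-in for a mix attention mechanism on a KG).

    node_feats: list of feature vectors (lists of floats), one per node
    edges: list of (src, dst) index pairs
    node_scores / edge_scores: scalar attention logits per node / per edge
    """
    new_feats = []
    for v, feat in enumerate(node_feats):
        incoming = [(u, i) for i, (u, dst) in enumerate(edges) if dst == v]
        if not incoming:
            # Nodes with no incoming evidence keep their features.
            new_feats.append(feat[:])
            continue
        # Mix node and edge attention logits, then normalize.
        weights = softmax([node_scores[u] + edge_scores[i] for u, i in incoming])
        agg = [0.0] * len(feat)
        for w, (u, _) in zip(weights, incoming):
            for d in range(len(feat)):
                agg[d] += w * node_feats[u][d]
        new_feats.append(agg)
    return new_feats
```

Stacking several such layers, with the attention logits re-derived from the choice representations at each layer, mirrors the idea of letting choice interaction guide which KG evidence is propagated.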
Anthology ID:
2022.emnlp-main.579
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
8461–8471
URL:
https://aclanthology.org/2022.emnlp-main.579
DOI:
10.18653/v1/2022.emnlp-main.579
Cite (ACL):
Chuzhan Hao, Minghui Xie, and Peng Zhang. 2022. ACENet: Attention Guided Commonsense Reasoning on Hybrid Knowledge Graph. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 8461–8471, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
ACENet: Attention Guided Commonsense Reasoning on Hybrid Knowledge Graph (Hao et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.579.pdf