BERT-MK: Integrating Graph Contextualized Knowledge into Pre-trained Language Models

Bin He, Di Zhou, Jinghui Xiao, Xin Jiang, Qun Liu, Nicholas Jing Yuan, Tong Xu


Abstract
Complex node interactions are common in knowledge graphs (KGs), and these interactions can be considered contextualized knowledge that exists in the topological structure of the KG. Traditional knowledge representation learning (KRL) methods usually treat a single triple as a training unit, neglecting this graph contextualized knowledge. To exploit this unexplored graph-level knowledge, we propose an approach to model subgraphs in a medical KG. The learned knowledge is then integrated into a pre-trained language model for knowledge generalization. Experimental results demonstrate that our model achieves state-of-the-art performance on several medical NLP tasks, and the improvement over MedERNIE indicates that graph contextualized knowledge is beneficial.
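The contrast the abstract draws between single-triple training units and graph contextualized knowledge can be illustrated with a small sketch. The snippet below is not the authors' code; the entities, relations, and 1-hop grouping are hypothetical, and it only shows how a subgraph around an entity, rather than an isolated triple, could serve as a training unit.

```python
# Minimal sketch (assumptions, not the paper's implementation): contrast
# single-triple training units with entity-centred subgraph units.
from collections import defaultdict

# Toy medical KG as (head, relation, tail) triples; names are made up.
triples = [
    ("aspirin", "treats", "headache"),
    ("aspirin", "may_cause", "stomach_bleeding"),
    ("headache", "symptom_of", "migraine"),
    ("ibuprofen", "treats", "headache"),
]

# Traditional KRL: each triple is an independent training unit.
triple_units = list(triples)

# Graph contextualized view: group every triple incident to an entity,
# so the training unit becomes a (1-hop) subgraph around that entity.
neighborhood = defaultdict(list)
for h, r, t in triples:
    neighborhood[h].append((h, r, t))
    neighborhood[t].append((h, r, t))

subgraph_units = dict(neighborhood)

print("number of triple units:", len(triple_units))
print("subgraph unit for 'headache':", subgraph_units["headache"])
```

Under this toy view, the subgraph unit for "headache" carries interactions with aspirin, ibuprofen, and migraine at once, which is the kind of neighborhood context a single triple cannot provide.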
Anthology ID:
2020.findings-emnlp.207
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Venues:
EMNLP | Findings
Publisher:
Association for Computational Linguistics
Pages:
2281–2290
URL:
https://aclanthology.org/2020.findings-emnlp.207
DOI:
10.18653/v1/2020.findings-emnlp.207
Cite (ACL):
Bin He, Di Zhou, Jinghui Xiao, Xin Jiang, Qun Liu, Nicholas Jing Yuan, and Tong Xu. 2020. BERT-MK: Integrating Graph Contextualized Knowledge into Pre-trained Language Models. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 2281–2290, Online. Association for Computational Linguistics.
Cite (Informal):
BERT-MK: Integrating Graph Contextualized Knowledge into Pre-trained Language Models (He et al., Findings 2020)
PDF:
https://aclanthology.org/2020.findings-emnlp.207.pdf
Data
BC5CDR