KLMo: Knowledge Graph Enhanced Pretrained Language Model with Fine-Grained Relationships

Lei He, Suncong Zheng, Tao Yang, Feng Zhang


Abstract
Interactions between entities in a knowledge graph (KG) provide rich knowledge for language representation learning. However, existing knowledge-enhanced pretrained language models (PLMs) focus only on entity information and ignore the fine-grained relationships between entities. In this work, we propose incorporating the KG (both entities and relations) into the language learning process to obtain a KG-enhanced pretrained language model, namely KLMo. Specifically, a novel knowledge aggregator is designed to explicitly model the interaction between entity spans in text and all entities and relations in a contextual KG. A relation prediction objective is used to incorporate relation information via distant supervision, and an entity linking objective further links entity spans in text to entities in the KG. In this way, structured knowledge can be effectively integrated into language representations. Experimental results demonstrate that KLMo achieves substantial improvements on several knowledge-driven tasks, such as entity typing and relation classification, compared with state-of-the-art knowledge-enhanced PLMs.
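
To make the aggregator idea concrete, below is a minimal, illustrative PyTorch sketch: pooled entity-span vectors attend over projected KG entity and relation embeddings, and the attended knowledge is fused back into each span representation. All names, dimensions, and the fusion scheme are assumptions for illustration only; they are not taken from the paper or the lei-nlp/klmo code.

import torch
import torch.nn as nn

class KnowledgeAggregator(nn.Module):
    """Sketch of a knowledge-aggregator layer (hypothetical, not the
    authors' implementation): text-side entity spans attend over the
    entities AND relations of a contextual KG."""

    def __init__(self, hidden_size=768, kg_size=200, num_heads=4):
        super().__init__()
        # project KG entity/relation embeddings into the text hidden space
        self.kg_proj = nn.Linear(kg_size, hidden_size)
        # span-to-KG cross-attention
        self.cross_attn = nn.MultiheadAttention(
            hidden_size, num_heads, batch_first=True
        )
        # fuse each original span vector with its attended knowledge vector
        self.fuse = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, span_reprs, kg_embeds):
        # span_reprs: (batch, num_spans, hidden) pooled entity-span vectors
        # kg_embeds:  (batch, num_nodes, kg_size) entity and relation embeddings
        kg = self.kg_proj(kg_embeds)
        knowledge, _ = self.cross_attn(span_reprs, kg, kg)
        return torch.tanh(self.fuse(torch.cat([span_reprs, knowledge], dim=-1)))

# usage: 2 entity spans attending over a 5-node contextual KG
agg = KnowledgeAggregator()
spans = torch.randn(1, 2, 768)
kg_nodes = torch.randn(1, 5, 200)
out = agg(spans, kg_nodes)  # (1, 2, 768) knowledge-enhanced span vectors

In this sketch the knowledge-enhanced span vectors would then feed the pretraining objectives the abstract describes: a relation prediction head supervised by distant labels from the KG, and an entity linking head that scores each span against candidate KG entities.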
Anthology ID:
2021.findings-emnlp.384
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
4536–4542
URL:
https://aclanthology.org/2021.findings-emnlp.384
DOI:
10.18653/v1/2021.findings-emnlp.384
Cite (ACL):
Lei He, Suncong Zheng, Tao Yang, and Feng Zhang. 2021. KLMo: Knowledge Graph Enhanced Pretrained Language Model with Fine-Grained Relationships. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 4536–4542, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
KLMo: Knowledge Graph Enhanced Pretrained Language Model with Fine-Grained Relationships (He et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.384.pdf
Code:
lei-nlp/klmo