Enhancing Contextual Word Representations Using Embedding of Neighboring Entities in Knowledge Graphs

Ryoko Tokuhisa, Keisuke Kawano, Akihiro Nakamura, Satoshi Koide


Abstract
Pre-trained language models (PLMs) such as BERT and RoBERTa have dramatically improved the performance of various natural language processing tasks. Although these models are trained on large amounts of raw text, they have no explicit grounding in real-world entities. Knowledge graphs (KGs) are manually annotated with factual knowledge and store the relations between nodes corresponding to entities as labeled edges. This paper proposes a mechanism called KG-attention, which integrates the structure of a KG into recent PLM architectures. Unlike the existing PLM+KG integration methods, KG-attention generalizes the embeddings of neighboring entities using the relation embeddings; accordingly, it can handle relations between unconnected entities in the KG. Experimental results demonstrated that our method achieved significant improvements in a relation classification task, an entity typing task, and several language comprehension tasks.
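The abstract describes attention over neighboring entities whose embeddings are generalized via relation embeddings. The following is a minimal illustrative sketch of that general idea, assuming a TransE-style composition (neighbor entity embedding plus relation embedding) and plain dot-product attention; the function name and the exact composition are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    z = x - x.max()
    e = np.exp(z)
    return e / e.sum()

def kg_attention_sketch(h, neighbor_emb, relation_emb):
    """Attend from a contextual representation h over generalized
    neighbor representations (entity + relation, TransE-style).

    NOTE: this composition and the attention form are illustrative
    assumptions, not the paper's exact KG-attention mechanism.
    h            : (d,)   contextual token/entity representation
    neighbor_emb : (k, d) embeddings of k neighboring entities
    relation_emb : (k, d) embeddings of the connecting relations
    """
    candidates = neighbor_emb + relation_emb   # (k, d) generalized neighbors
    scores = candidates @ h                    # (k,)  dot-product attention
    weights = softmax(scores)                  # (k,)  attention distribution
    return weights @ candidates                # (d,)  KG-informed summary

# Toy usage with random embeddings.
rng = np.random.default_rng(0)
d, k = 8, 3
h = rng.normal(size=d)
out = kg_attention_sketch(h, rng.normal(size=(k, d)), rng.normal(size=(k, d)))
```

Because the attention keys are entity-plus-relation compositions rather than raw neighbor embeddings, a model of this shape can score plausible neighbors that are not explicitly connected in the KG, which is the property the abstract highlights.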
Anthology ID:
2022.coling-1.281
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
3175–3186
URL:
https://aclanthology.org/2022.coling-1.281
Cite (ACL):
Ryoko Tokuhisa, Keisuke Kawano, Akihiro Nakamura, and Satoshi Koide. 2022. Enhancing Contextual Word Representations Using Embedding of Neighboring Entities in Knowledge Graphs. In Proceedings of the 29th International Conference on Computational Linguistics, pages 3175–3186, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Enhancing Contextual Word Representations Using Embedding of Neighboring Entities in Knowledge Graphs (Tokuhisa et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.281.pdf
Data:
GLUE, Open Entity, TACRED