2022
Enhancing Contextual Word Representations Using Embedding of Neighboring Entities in Knowledge Graphs
Ryoko Tokuhisa | Keisuke Kawano | Akihiro Nakamura | Satoshi Koide
Proceedings of the 29th International Conference on Computational Linguistics
Pre-trained language models (PLMs) such as BERT and RoBERTa have dramatically improved the performance of various natural language processing tasks. Although these models are trained on large amounts of raw text, they have no explicit grounding in real-world entities. Knowledge graphs (KGs) are manually annotated with factual knowledge and store the relations between nodes corresponding to entities as labeled edges. This paper proposes a mechanism called KG-attention, which integrates the structure of a KG into recent PLM architectures. Unlike existing PLM+KG integration methods, KG-attention generalizes the embeddings of neighboring entities using the relation embeddings; accordingly, it can handle relations between entities that are not directly connected in the KG. Experimental results demonstrated that our method achieved significant improvements in a relation classification task, an entity typing task, and several language comprehension tasks.
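The abstract does not give the exact formulation of KG-attention, so the following is only a minimal sketch of how an attention layer over neighboring entities and their relation embeddings could look in PyTorch. The class name KGAttentionSketch, the TransE-style additive composition of entity and relation embeddings, the single attention head, and the residual fusion are all illustrative assumptions, not the authors' implementation.

# Minimal sketch: a token's PLM hidden state attends over its KG neighbors,
# each represented as (entity embedding + relation embedding). Assumed design,
# not the paper's exact KG-attention mechanism.
import torch
import torch.nn as nn
import torch.nn.functional as F

class KGAttentionSketch(nn.Module):
    def __init__(self, hidden_dim: int, kg_dim: int):
        super().__init__()
        self.query = nn.Linear(hidden_dim, kg_dim)  # project PLM hidden state into KG space
        self.out = nn.Linear(kg_dim, hidden_dim)    # project attended KG vector back

    def forward(self, token_hidden, entity_emb, relation_emb):
        # token_hidden: (batch, hidden_dim)          PLM representation of an entity mention
        # entity_emb:   (batch, n_neighbors, kg_dim) embeddings of neighboring entities
        # relation_emb: (batch, n_neighbors, kg_dim) embeddings of the connecting relations
        neighbors = entity_emb + relation_emb        # assumed TransE-style composition
        q = self.query(token_hidden).unsqueeze(1)    # (batch, 1, kg_dim)
        scores = (q * neighbors).sum(-1) / neighbors.size(-1) ** 0.5
        attn = F.softmax(scores, dim=-1)             # attention weights over neighbors
        pooled = torch.bmm(attn.unsqueeze(1), neighbors).squeeze(1)
        return token_hidden + self.out(pooled)       # fuse the KG signal into the PLM state

# Toy usage with random tensors (dimensions are arbitrary examples)
layer = KGAttentionSketch(hidden_dim=768, kg_dim=100)
h = torch.randn(2, 768)
e = torch.randn(2, 5, 100)
r = torch.randn(2, 5, 100)
print(layer(h, e, r).shape)  # torch.Size([2, 768])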