Improving Knowledge Graph Embedding Using Affine Transformations of Entities Corresponding to Each Relation

Jinfa Yang, Yongjie Shi, Xin Tong, Robin Wang, Taiyan Chen, Xianghua Ying


Abstract
Finding a suitable embedding for a knowledge graph remains a major challenge. In previous knowledge graph embedding methods, every entity in a knowledge graph is usually represented as a k-dimensional vector. An affine transformation can be expressed as a matrix multiplication followed by a translation vector. In this paper, we first apply a set of relation-specific affine transformations to the entity vectors, and then use the transformed vectors for embedding with previous methods. The main advantage of affine transformations is their good geometric properties and interpretability. Our experimental results demonstrate that this intuitive design provides a statistically significant increase in performance while adding only a few extra processing steps or a limited number of additional variables. Taking TransE as an example, we employ the scale transformation (a special case of an affine transformation) and introduce only k additional variables for each relation. Surprisingly, it even outperforms RotatE to some extent on various datasets. We also introduce affine transformations into RotatE, DistMult, and ComplEx, respectively, and each one outperforms its original method.
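As a rough illustration of the idea in the abstract, the sketch below composes a relation-specific scale transformation (the diagonal special case of an affine map) with the TransE score, adding only k parameters per relation. The names scale_transe_score and s_r are our own for illustration; the paper's exact parameterization and training objective may differ.

import numpy as np

def transe_score(h, r, t):
    """Standard TransE score: negative L1 distance between (h + r) and t."""
    return -np.sum(np.abs(h + r - t))

def scale_transe_score(h, r, t, s_r):
    """TransE score after a relation-specific scale transformation:
    entity vectors are rescaled element-wise by s_r (k extra parameters
    per relation) before the usual translation-based comparison."""
    return -np.sum(np.abs(s_r * h + r - s_r * t))

# Toy example with k = 4 dimensional embeddings.
k = 4
rng = np.random.default_rng(0)
h, r, t = rng.normal(size=k), rng.normal(size=k), rng.normal(size=k)
s_r = rng.normal(size=k)  # relation-specific scale vector
print(scale_transe_score(h, r, t, s_r))

A full affine version would replace the element-wise scale s_r with a k-by-k matrix multiplication plus a translation vector, at the cost of more parameters per relation.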
Anthology ID:
2021.findings-emnlp.46
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
508–517
URL:
https://aclanthology.org/2021.findings-emnlp.46
DOI:
10.18653/v1/2021.findings-emnlp.46
Cite (ACL):
Jinfa Yang, Yongjie Shi, Xin Tong, Robin Wang, Taiyan Chen, and Xianghua Ying. 2021. Improving Knowledge Graph Embedding Using Affine Transformations of Entities Corresponding to Each Relation. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 508–517, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Improving Knowledge Graph Embedding Using Affine Transformations of Entities Corresponding to Each Relation (Yang et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.46.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.46.mp4
Data:
FB15k-237, YAGO