Knowledge Graph Embedding by Adaptive Limit Scoring Loss Using Dynamic Weighting Strategy

Jinfa Yang, Xianghua Ying, Yongjie Shi, Xin Tong, Ruibin Wang, Taiyan Chen, Bowei Xing


Abstract
Knowledge graph embedding aims to represent entities and relations as low-dimensional vectors, which is an effective way to predict missing links in knowledge graphs. Designing a strong and effective loss framework is essential for knowledge graph embedding models to distinguish between correct and incorrect triplets. The classic margin-based ranking loss constrains the scores of positive and negative triplets to be separated by a suitable margin. The recently proposed Limit-based Scoring Loss independently limits the ranges of positive and negative triplet scores. However, these loss frameworks use equal or fixed penalty terms to reduce the scores of positive and negative sample pairs, which is inflexible during optimization. Our intuition is that if a triplet score deviates far from the optimum, it should be emphasized. To this end, we propose Adaptive Limit Scoring Loss, which simply re-weights each triplet to highlight the less-optimized triplet scores. We apply this loss framework to several knowledge graph embedding models such as TransE, TransH and ComplEx. The experimental results on link prediction and triplet classification show that our proposed method achieves performance on par with the state of the art.
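The idea described in the abstract can be illustrated with a minimal sketch. It assumes distance-style scores (lower is better, as in TransE), two limits `gamma_pos` and `gamma_neg` bounding positive and negative scores, and a softmax-style re-weighting over violations so that the least-optimized triplets receive the largest weight; the weighting scheme and parameter names here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def adaptive_limit_loss(pos_scores, neg_scores,
                        gamma_pos=1.0, gamma_neg=2.0, alpha=1.0):
    """Sketch of a limit-based loss with adaptive per-triplet weighting.

    Positive triplet scores are pushed below gamma_pos and negative
    scores above gamma_neg (hinge terms). Each violation is weighted
    by a softmax over the violations, so triplets that deviate most
    from their limit dominate the loss (hypothetical weighting; the
    paper's exact dynamic weighting strategy may differ).
    """
    # Hinge-style violations of the two independent limits.
    pos_viol = np.maximum(pos_scores - gamma_pos, 0.0)
    neg_viol = np.maximum(gamma_neg - neg_scores, 0.0)

    def weighted(viol):
        if viol.sum() == 0.0:
            return 0.0  # all triplets already satisfy their limit
        w = np.exp(alpha * viol)
        w = w / w.sum()  # larger violation -> larger weight
        return float((w * viol).sum())

    return weighted(pos_viol) + weighted(neg_viol)
```

For example, with `pos_scores = [0.5, 3.0]` and `neg_scores = [5.0, 0.5]`, only the second triplet in each array violates its limit, so the loss is dominated by those two terms; when every score is within its limit, the loss is exactly zero.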
Anthology ID:
2022.findings-acl.91
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1153–1163
URL:
https://aclanthology.org/2022.findings-acl.91
DOI:
10.18653/v1/2022.findings-acl.91
Cite (ACL):
Jinfa Yang, Xianghua Ying, Yongjie Shi, Xin Tong, Ruibin Wang, Taiyan Chen, and Bowei Xing. 2022. Knowledge Graph Embedding by Adaptive Limit Scoring Loss Using Dynamic Weighting Strategy. In Findings of the Association for Computational Linguistics: ACL 2022, pages 1153–1163, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Knowledge Graph Embedding by Adaptive Limit Scoring Loss Using Dynamic Weighting Strategy (Yang et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.91.pdf
Data
FB15k-237