Random Entity Quantization for Parameter-Efficient Compositional Knowledge Graph Representation

Jiaang Li, Quan Wang, Yi Liu, Licheng Zhang, Zhendong Mao


Abstract
Representation learning on Knowledge Graphs (KGs) is essential for downstream tasks. The dominant approach, KG Embedding (KGE), represents entities with independent vectors and therefore faces a scalability challenge: its parameter count grows linearly with the number of entities. Recent studies propose a parameter-efficient alternative that represents each entity by composing codewords matched from predefined small-scale codebooks. We refer to the process of obtaining the codewords of each entity as entity quantization, for which previous works have designed complicated strategies. Surprisingly, this paper shows that simple random entity quantization can achieve results similar to those of current strategies. We analyze this phenomenon and reveal that under random entity quantization, entity codes, the quantization outcomes that express entities, have higher entropy at the code level and higher Jaccard distance at the codeword level. Different entities therefore become easier to distinguish, facilitating effective KG representation. These results show that current quantization strategies are not critical for KG representation, and that there is still room to improve entity distinguishability beyond current strategies.
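The authors' implementation is not reproduced on this page. As a minimal NumPy sketch of the ideas named in the abstract, the snippet below quantizes each entity by drawing a random subset of codeword indices from a small shared codebook, composes an entity representation by pooling those codewords, and probes distinguishability with pairwise Jaccard distance plus a codeword-usage entropy proxy. All sizes and names (codebook_size, compose, etc.) are illustrative assumptions, not the paper's code or exact definitions.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

num_entities = 10_000     # entities in the KG (illustrative value)
codebook_size = 1_000     # size of the shared codebook (illustrative)
codewords_per_entity = 8  # codewords composed per entity (illustrative)
dim = 64                  # codeword embedding dimension (illustrative)

# A shared codebook of codeword vectors (randomly initialized here;
# in a KGE model these would be learned parameters).
codebook = rng.normal(size=(codebook_size, dim))

# Random entity quantization: each entity's code is simply a random
# subset of codeword indices, with no designed matching strategy.
entity_codes = np.stack([
    rng.choice(codebook_size, size=codewords_per_entity, replace=False)
    for _ in range(num_entities)
])

def compose(entity_id: int) -> np.ndarray:
    """Compose an entity representation by pooling its codewords
    (mean pooling is one simple choice of composition function)."""
    return codebook[entity_codes[entity_id]].mean(axis=0)

def jaccard_distance(code_a: np.ndarray, code_b: np.ndarray) -> float:
    """Jaccard distance between two entity codes viewed as codeword
    sets; larger values mean the entities are easier to distinguish."""
    a, b = set(code_a.tolist()), set(code_b.tolist())
    return 1.0 - len(a & b) / len(a | b)

# Codeword-level distinguishability: average Jaccard distance over a
# sample of entity pairs.
pairs = rng.integers(num_entities, size=(1_000, 2))
avg_dist = float(np.mean([
    jaccard_distance(entity_codes[i], entity_codes[j])
    for i, j in pairs if i != j
]))
print(f"avg Jaccard distance between entity codes: {avg_dist:.3f}")

# Code-level entropy proxy: entropy of codeword usage frequencies.
counts = np.bincount(entity_codes.ravel(), minlength=codebook_size)
probs = counts / counts.sum()
entropy = -np.sum(probs[probs > 0] * np.log2(probs[probs > 0]))
print(f"codeword-usage entropy: {entropy:.2f} bits "
      f"(max {np.log2(codebook_size):.2f})")
```

Because random small subsets of a large codebook rarely overlap, the measured Jaccard distances come out close to 1 and the usage entropy lands near its maximum, matching the abstract's intuition that random quantization keeps entity codes well separated.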
Anthology ID: 2023.emnlp-main.177
Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 2917–2928
URL: https://aclanthology.org/2023.emnlp-main.177
DOI: 10.18653/v1/2023.emnlp-main.177
Cite (ACL):
Jiaang Li, Quan Wang, Yi Liu, Licheng Zhang, and Zhendong Mao. 2023. Random Entity Quantization for Parameter-Efficient Compositional Knowledge Graph Representation. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 2917–2928, Singapore. Association for Computational Linguistics.
Cite (Informal):
Random Entity Quantization for Parameter-Efficient Compositional Knowledge Graph Representation (Li et al., EMNLP 2023)
PDF: https://aclanthology.org/2023.emnlp-main.177.pdf
Video: https://aclanthology.org/2023.emnlp-main.177.mp4