RPD: A Distance Function Between Word Embeddings

Xuhui Zhou, Shujian Huang, Zaixiang Zheng


Abstract
It is well understood that different algorithms, training processes, and corpora produce different word embeddings. However, less is known about the relations between different embedding spaces, i.e., how far different sets of embeddings deviate from each other. In this paper, we propose a novel metric called Relative Pairwise Inner Product Distance (RPD) to quantify the distance between different sets of word embeddings. This unitary-invariant metric provides a unified scale for comparing different sets of word embeddings. Based on the properties of RPD, we systematically study the relations between word embeddings produced by different algorithms and investigate the influence of different training processes and corpora. The results shed light on these poorly understood aspects of word embeddings and justify RPD as a measure of the distance between embedding spaces.
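As the abstract notes, RPD is built on pairwise inner products between word vectors and is invariant to unitary transformations of either embedding space. The following is a minimal NumPy sketch of how such a pairwise-inner-product distance can be computed over two aligned embedding matrices; the particular normalization used here is an illustrative assumption, not necessarily the paper's exact definition.

```python
import numpy as np

def rpd(x, y):
    """Sketch of a relative pairwise-inner-product distance.

    x, y: (n_words, dim) embedding matrices with rows aligned to the
    same vocabulary. The Gram matrices x @ x.T and y @ y.T collect all
    pairwise inner products, which are unchanged if either embedding
    set is rotated by an orthogonal (unitary) matrix.

    NOTE: the normalization below is a hypothetical choice for
    illustration; consult the paper for the exact definition.
    """
    gx = x @ x.T  # pairwise inner products within embedding set x
    gy = y @ y.T  # pairwise inner products within embedding set y
    return (np.linalg.norm(gx - gy) ** 2
            / (np.linalg.norm(gx) * np.linalg.norm(gy)))

# Unitary invariance: rotating one embedding set by an orthogonal
# matrix q leaves the distance unchanged, since (xq)(xq)^T = x x^T.
rng = np.random.default_rng(0)
x = rng.standard_normal((50, 10))
y = rng.standard_normal((50, 10))
q, _ = np.linalg.qr(rng.standard_normal((10, 10)))  # random orthogonal matrix
print(np.isclose(rpd(x, y), rpd(x @ q, y)))
```

Because the Gram matrices depend only on inner products, the metric compares the geometry of the two spaces rather than their (arbitrary) coordinate axes, which is what makes a unified scale across embedding algorithms possible.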
Anthology ID:
2020.acl-srw.7
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop
Month:
July
Year:
2020
Address:
Online
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
42–50
URL:
https://aclanthology.org/2020.acl-srw.7
DOI:
10.18653/v1/2020.acl-srw.7
Cite (ACL):
Xuhui Zhou, Shujian Huang, and Zaixiang Zheng. 2020. RPD: A Distance Function Between Word Embeddings. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop, pages 42–50, Online. Association for Computational Linguistics.
Cite (Informal):
RPD: A Distance Function Between Word Embeddings (Zhou et al., ACL 2020)
PDF:
https://aclanthology.org/2020.acl-srw.7.pdf
Video:
http://slideslive.com/38928645