Polar Embedding
Ran
Iwamoto
author
Ryosuke
Kohita
author
Akifumi
Wachi
author
2021-11
text
Proceedings of the 25th Conference on Computational Natural Language Learning
Arianna
Bisazza
editor
Omri
Abend
editor
Association for Computational Linguistics
Online
conference publication
Hierarchical relationships are invaluable information for many natural language processing (NLP) tasks. Distributional representation has become a fundamental approach for encoding word relationships; in particular, embeddings in hyperbolic space have shown great performance in representing hierarchies by taking advantage of their spatial properties. However, most machine learning systems are not designed to operate in such complex non-Euclidean geometries. To achieve hierarchy representations in the commonly used Euclidean space, we propose Polar Embedding, which learns word embeddings in the polar coordinate system. Utilizing the characteristics of polar coordinates, the hierarchy of words is expressed with two independent variables: radius (generality) and angles (similarity), and these variables are optimized separately. Polar embedding shows word hierarchies explicitly and allows us to use beneficial resources such as word frequencies or word generality annotations for computing radii. We introduce an optimization method for learning angles in the limited ranges of polar coordinates, which combines a loss function controlling gradients with distribution uniformization. Experimental results on hypernymy datasets indicate that our approach outperforms other embeddings in low-dimensional Euclidean space and performs competitively even with hyperbolic embeddings, which possess a geometric advantage.
iwamoto-etal-2021-polar
10.18653/v1/2021.conll-1.37
https://aclanthology.org/2021.conll-1.37
2021-11
470
480