Title: Relation Induction in Word Embeddings Revisited
Authors: Zied Bouraoui, Shoaib Jameel, Steven Schockaert
Date: 2018-08
Venue: Proceedings of the 27th International Conference on Computational Linguistics
Editors: Emily M. Bender, Leon Derczynski, Pierre Isabelle
Publisher: Association for Computational Linguistics
Location: Santa Fe, New Mexico, USA
Type: conference publication
Abstract: Given a set of instances of some relation, the relation induction task is to predict which other word pairs are likely to be related in the same way. While it is natural to use word embeddings for this task, standard approaches based on vector translations turn out to perform poorly. To address this issue, we propose two probabilistic relation induction models. The first model is based on translations, but uses Gaussians to explicitly model the variability of these translations and to encode soft constraints on the source and target words that may be chosen. In the second model, we use Bayesian linear regression to encode the assumption that there is a linear relationship between the vector representations of related words, which is considerably weaker than the assumption underlying translation-based models.
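The core idea behind the second model described in the abstract, a linear map between the embeddings of related words, with uncertainty captured by Bayesian linear regression, can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the prior precision `alpha`, noise precision `beta`, and scoring candidate pairs by predictive log-density are all illustrative assumptions.

```python
import numpy as np

def fit_bayesian_linear(S, T, alpha=1.0, beta=10.0):
    """Posterior over a linear map given observed related word pairs.

    S: (n, d) source-word vectors; T: (n, d) target-word vectors.
    alpha: prior precision on the weights; beta: noise precision
    (both illustrative hyperparameters). Returns the posterior mean
    map M (d, d) and shared posterior covariance Sigma (d, d).
    """
    d = S.shape[1]
    Sigma = np.linalg.inv(alpha * np.eye(d) + beta * S.T @ S)
    M = beta * Sigma @ S.T @ T  # column j is the map onto output dim j
    return M, Sigma

def log_predictive(s, t, M, Sigma, beta=10.0):
    """Log Gaussian predictive density of target vector t given source s.

    Higher values mean the pair (s, t) is more consistent with the
    linear relationship inferred from the training pairs.
    """
    mean = M.T @ s
    var = 1.0 / beta + s @ Sigma @ s  # same predictive variance per dim
    resid = t - mean
    d = len(t)
    return -0.5 * (d * np.log(2 * np.pi * var) + resid @ resid / var)
```

In this sketch, relation induction amounts to fitting the posterior on the given instance pairs and then ranking candidate word pairs by their predictive log-density; because the map is an arbitrary linear transformation with explicit uncertainty, this assumption is weaker than requiring a single shared translation vector.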
Anthology ID: bouraoui-etal-2018-relation
URL: https://aclanthology.org/C18-1138
Pages: 1627-1637