Joint Unsupervised Learning of Semantic Representation of Words and Roles in Dependency Trees

Michal Konkol


Abstract
In this paper, we introduce WoRel, a model that jointly learns word embeddings and a semantic representation of word relations. The model learns from plain-text sentences and their dependency parse trees. The word embeddings produced by WoRel outperform Skip-Gram and GloVe on word similarity and syntactic word analogy tasks, and achieve comparable results on word relatedness and semantic word analogy tasks. We show that the semantic representation of relations makes it possible to express the meaning of phrases and is a promising research direction for sentence-level semantics.
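The abstract does not spell out WoRel's training objective, so the sketch below is not the paper's method; it illustrates the general family it relates to, namely dependency-based word embeddings in the style of Levy and Goldberg (2014), where each word is trained against (relation, word) contexts drawn from the parse tree rather than against linear window contexts. All data, names, and hyperparameters here are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Toy (head, relation, dependent) triples, standing in for the output of a
# dependency parser; a real system would extract these from parsed corpora.
triples = [
    ("eats", "nsubj", "cat"), ("eats", "dobj", "fish"),
    ("sleeps", "nsubj", "dog"), ("chases", "nsubj", "dog"),
    ("chases", "dobj", "cat"), ("eats", "nsubj", "dog"),
]

# Vocabulary of words and of relation-typed contexts: a dependent sees its
# head through relation r; a head sees its dependent through the inverse r_inv.
words = sorted({w for h, _, d in triples for w in (h, d)})
ctxs = sorted({(r, h) for h, r, _ in triples}
              | {(r + "_inv", d) for _, r, d in triples})
wi = {w: i for i, w in enumerate(words)}
ci = {c: i for i, c in enumerate(ctxs)}

dim, lr, epochs, neg = 16, 0.05, 200, 3
W = rng.normal(0, 0.1, (len(words), dim))  # word vectors
C = rng.normal(0, 0.1, (len(ctxs), dim))   # (relation, word) context vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Training pairs: (target word index, context index) in both directions.
pairs = ([(wi[d], ci[(r, h)]) for h, r, d in triples]
         + [(wi[h], ci[(r + "_inv", d)]) for h, r, d in triples])

# Skip-Gram with negative sampling over dependency contexts.
for _ in range(epochs):
    for w, c in pairs:
        # one positive context plus `neg` randomly sampled negative contexts
        samples = [(c, 1.0)] + [(int(rng.integers(len(ctxs))), 0.0)
                                for _ in range(neg)]
        for cc, label in samples:
            g = sigmoid(W[w] @ C[cc]) - label  # gradient of logistic loss
            W[w], C[cc] = W[w] - lr * g * C[cc], C[cc] - lr * g * W[w]

# "cat" and "dog" share nsubj contexts, so their vectors should correlate.
cat, dog = W[wi["cat"]], W[wi["dog"]]
print(f"cos(cat, dog) = {cat @ dog / (np.linalg.norm(cat) * np.linalg.norm(dog)):.3f}")

Because contexts are typed by the dependency relation, words that fill the same syntactic roles end up close together, which is one plausible reason dependency-informed embeddings do well on syntactic analogy tasks; how WoRel additionally represents the relations themselves is detailed in the paper.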
Anthology ID:
R17-1052
Volume:
Proceedings of the International Conference Recent Advances in Natural Language Processing, RANLP 2017
Month:
September
Year:
2017
Address:
Varna, Bulgaria
Editors:
Ruslan Mitkov, Galia Angelova
Venue:
RANLP
Publisher:
INCOMA Ltd.
Pages:
394–400
URL:
https://doi.org/10.26615/978-954-452-049-6_052
DOI:
10.26615/978-954-452-049-6_052
Cite (ACL):
Michal Konkol. 2017. Joint Unsupervised Learning of Semantic Representation of Words and Roles in Dependency Trees. In Proceedings of the International Conference Recent Advances in Natural Language Processing, RANLP 2017, pages 394–400, Varna, Bulgaria. INCOMA Ltd.
Cite (Informal):
Joint Unsupervised Learning of Semantic Representation of Words and Roles in Dependency Trees (Konkol, RANLP 2017)
PDF:
https://doi.org/10.26615/978-954-452-049-6_052