Enhancing Word Embeddings with Knowledge Extracted from Lexical Resources

Magdalena Biesialska, Bardia Rafieian, Marta R. Costa-jussà


Abstract
In this work, we present an effective method for semantic specialization of word vector representations. To this end, we use traditional word embeddings and apply specialization methods to better capture semantic relations between words. In our approach, we leverage external knowledge from rich lexical resources such as BabelNet. We also show that our proposed post-specialization method, based on an adversarial neural network with the Wasserstein distance, yields improvements over state-of-the-art methods on two tasks: word similarity and dialog state tracking.
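The abstract describes post-specialization as an adversarial mapping trained with the Wasserstein distance: a generator learns to transform original distributional vectors into their specialized counterparts, while a critic approximates the Wasserstein distance between mapped and specialized vectors. The sketch below is only illustrative of that general setup; the architectures, dimensions, and hyperparameters are assumptions, not the authors' implementation (see the linked repository for the actual code).

```python
# Minimal WGAN-style post-specialization sketch (illustrative only).
# Assumed: 300-dim embeddings and simple MLP generator/critic.
import torch
import torch.nn as nn

EMB_DIM = 300  # assumed embedding dimensionality

class Generator(nn.Module):
    """Maps an original word vector to a predicted specialized vector."""
    def __init__(self, dim=EMB_DIM, hidden=512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x):
        return self.net(x)

class Critic(nn.Module):
    """Scores vectors; the score gap approximates the Wasserstein distance."""
    def __init__(self, dim=EMB_DIM, hidden=512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)

def train_step(gen, critic, opt_g, opt_c, original, specialized, clip=0.01):
    """One WGAN update on a batch of (original, specialized) vector pairs."""
    # Critic update: maximize E[f(specialized)] - E[f(G(original))]
    opt_c.zero_grad()
    loss_c = -(critic(specialized).mean() - critic(gen(original).detach()).mean())
    loss_c.backward()
    opt_c.step()
    # Weight clipping enforces the Lipschitz constraint (original WGAN)
    for p in critic.parameters():
        p.data.clamp_(-clip, clip)

    # Generator update: make mapped vectors look like specialized ones
    opt_g.zero_grad()
    loss_g = -critic(gen(original)).mean()
    loss_g.backward()
    opt_g.step()
    return loss_c.item(), loss_g.item()
```

Once trained, the generator can be applied to the full vocabulary, so that words not covered by the lexical constraints also receive specialized vectors.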
Anthology ID:
2020.acl-srw.36
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop
Month:
July
Year:
2020
Address:
Online
Editors:
Shruti Rijhwani, Jiangming Liu, Yizhong Wang, Rotem Dror
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
271–278
URL:
https://aclanthology.org/2020.acl-srw.36
DOI:
10.18653/v1/2020.acl-srw.36
Cite (ACL):
Magdalena Biesialska, Bardia Rafieian, and Marta R. Costa-jussà. 2020. Enhancing Word Embeddings with Knowledge Extracted from Lexical Resources. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop, pages 271–278, Online. Association for Computational Linguistics.
Cite (Informal):
Enhancing Word Embeddings with Knowledge Extracted from Lexical Resources (Biesialska et al., ACL 2020)
PDF:
https://aclanthology.org/2020.acl-srw.36.pdf
Video:
http://slideslive.com/38928680
Code:
mbiesialska/wgan-postspec