Learning Lexical Subspaces in a Distributional Vector Space

Kushal Arora, Aishik Chakraborty, Jackie C. K. Cheung


Abstract
In this paper, we propose LexSub, a novel approach towards unifying lexical and distributional semantics. We inject knowledge about lexical-semantic relations into distributional word embeddings by defining subspaces of the distributional vector space in which a lexical relation should hold. Our framework can handle symmetric attract and repel relations (e.g., synonymy and antonymy, respectively), as well as asymmetric relations (e.g., hypernymy and meronymy). In a suite of intrinsic benchmarks, we show that our model outperforms previous approaches on relatedness tasks and on hypernymy classification and detection, while being competitive on word similarity tasks. It also outperforms previous systems on extrinsic classification tasks that benefit from exploiting lexical relational cues. We perform a series of analyses to understand the behaviors of our model. Code available at https://github.com/aishikchakraborty/LexSub.
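The core idea in the abstract can be illustrated with a minimal sketch: project the shared distributional embeddings into a relation-specific subspace via a linear map, then apply an "attract" loss (pull synonyms together) or a "repel" loss (push antonyms apart) inside that subspace. This is a hedged toy illustration, not the paper's actual implementation; the projection matrices, margins, and loss forms below are assumptions for exposition.

```python
import numpy as np

# Hypothetical setup: d-dim distributional embeddings, and learned k-dim
# subspace projections for each lexical relation (here random stand-ins).
rng = np.random.default_rng(0)
d, k = 300, 50
W_syn = rng.normal(size=(k, d)) / np.sqrt(d)  # projection into a "synonymy subspace"
W_ant = rng.normal(size=(k, d)) / np.sqrt(d)  # projection into an "antonymy subspace"

def cosine(u, v):
    # Cosine similarity between two vectors.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def attract_loss(w1, w2, W, margin=0.5):
    # Symmetric "attract" relation (e.g., synonymy): penalize the pair when
    # their images in the relation's subspace are not similar enough.
    return max(0.0, margin - cosine(W @ w1, W @ w2))

def repel_loss(w1, w2, W, margin=0.5):
    # Symmetric "repel" relation (e.g., antonymy): penalize the pair when
    # their images in the relation's subspace are too similar.
    return max(0.0, cosine(W @ w1, W @ w2) - margin)

# Toy usage: random vectors stand in for real word embeddings.
v_happy, v_glad, v_sad = rng.normal(size=(3, d))
l_attract = attract_loss(v_happy, v_glad, W_syn)
l_repel = repel_loss(v_happy, v_sad, W_ant)
```

In training, such relation losses would be summed with the usual distributional objective so that the shared embeddings satisfy each relation in its own subspace; asymmetric relations (e.g., hypernymy) would need an order-respecting loss rather than the symmetric ones above.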
Anthology ID: 2020.tacl-1.21
Volume: Transactions of the Association for Computational Linguistics, Volume 8
Year: 2020
Address: Cambridge, MA
Editors: Mark Johnson, Brian Roark, Ani Nenkova
Venue: TACL
Publisher: MIT Press
Pages: 311–329
URL: https://aclanthology.org/2020.tacl-1.21
DOI: 10.1162/tacl_a_00316
Cite (ACL): Kushal Arora, Aishik Chakraborty, and Jackie C. K. Cheung. 2020. Learning Lexical Subspaces in a Distributional Vector Space. Transactions of the Association for Computational Linguistics, 8:311–329.
Cite (Informal): Learning Lexical Subspaces in a Distributional Vector Space (Arora et al., TACL 2020)
PDF: https://aclanthology.org/2020.tacl-1.21.pdf
Code: aishikchakraborty/LexSub