Quantum-Inspired Complex Word Embedding

Qiuchi Li, Sagar Uprety, Benyou Wang, Dawei Song


Abstract
A challenging task for word embeddings is to capture the emergent meaning or polarity of a combination of individual words. For example, existing word embedding approaches will assign high probabilities to the words “Penguin” and “Fly” if they frequently co-occur, but they fail to capture the fact that the two occur in an opposite sense: penguins do not fly. We hypothesize that humans do not associate a single polarity or sentiment with each word. A word's contribution to the overall polarity of a combination depends upon which other words it is combined with. This is analogous to the behavior of microscopic particles, which exist in all possible states at the same time and interfere with each other to give rise to new states depending upon their relative phases. We make use of the Hilbert space representation of such particles in quantum mechanics, in which we ascribe to each word a relative phase, represented as a complex number, and investigate two such quantum-inspired models to derive the meaning of a combination of words. The proposed models achieve better performance than state-of-the-art non-quantum models on binary sentence classification tasks.
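
The central idea of the abstract, superposing complex-valued word vectors so that their relative phases produce interference, can be illustrated with a minimal NumPy sketch. This is an illustration under assumed details (toy random amplitudes, hand-picked scalar phases, a hypothetical complex_embedding helper), not the authors' actual models.

import numpy as np

# Illustrative sketch only (not the paper's implementation): each word is
# represented as a complex vector r * exp(i * phi), where r is a real
# amplitude vector and phi a phase. Words are combined by superposition;
# the squared modulus of the sum contains a cross term proportional to
# cos(phi_1 - phi_2), i.e. interference governed by the relative phase.

rng = np.random.default_rng(0)
dim = 4

def complex_embedding(amplitude, phase):
    # Build a complex word vector from a real amplitude vector and a scalar phase.
    return amplitude * np.exp(1j * phase)

# Hypothetical toy vocabulary with hand-picked phases.
penguin = complex_embedding(rng.random(dim), phase=0.0)
fly = complex_embedding(rng.random(dim), phase=np.pi)  # opposite phase

# Superposition of the two word vectors.
combined = penguin + fly

# Intensity (squared modulus) of the combination: with opposite phases the
# interference term is negative, so the words cancel rather than reinforce.
print(np.abs(combined) ** 2)

With equal phases the cross term would be positive and the combined intensity larger than the sum of the parts; the relative phase is what lets a pair like “Penguin” and “Fly” attenuate rather than reinforce each other.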
Anthology ID:
W18-3006
Volume:
Proceedings of the Third Workshop on Representation Learning for NLP
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Isabelle Augenstein, Kris Cao, He He, Felix Hill, Spandana Gella, Jamie Kiros, Hongyuan Mei, Dipendra Misra
Venue:
RepL4NLP
SIG:
SIGREP
Publisher:
Association for Computational Linguistics
Pages:
50–57
URL:
https://aclanthology.org/W18-3006
DOI:
10.18653/v1/W18-3006
Cite (ACL):
Qiuchi Li, Sagar Uprety, Benyou Wang, and Dawei Song. 2018. Quantum-Inspired Complex Word Embedding. In Proceedings of the Third Workshop on Representation Learning for NLP, pages 50–57, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
Quantum-Inspired Complex Word Embedding (Li et al., RepL4NLP 2018)
PDF:
https://aclanthology.org/W18-3006.pdf
Data
MPQA Opinion Corpus