Context encoders as a simple but powerful extension of word2vec

Franziska Horn


Abstract
With a strikingly simple architecture and the ability to learn meaningful word embeddings efficiently from texts containing billions of words, word2vec remains one of the most popular neural language models used today. However, as only a single embedding is learned for every word in the vocabulary, the model fails to optimally represent words with multiple meanings and, additionally, it is not possible to create embeddings for new (out-of-vocabulary) words on the spot. Based on an intuitive interpretation of the continuous bag-of-words (CBOW) word2vec model’s negative sampling training objective in terms of predicting context-based similarities, we motivate an extension of the model we call context encoders (ConEc). By multiplying the matrix of trained word2vec embeddings with a word’s average context vector, out-of-vocabulary (OOV) embeddings and representations for words with multiple meanings can be created based on the words’ local contexts. The benefits of this approach are illustrated by using these word embeddings as features in the CoNLL 2003 named entity recognition (NER) task.
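The core idea of the abstract (multiplying the trained embedding matrix with a word's average context vector) can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the vocabulary, matrix sizes, and function names are made up for the example, and the context vector here is a simple normalized count vector over a word's local context words.

```python
import numpy as np

# Toy setup (illustrative only): W plays the role of a trained
# word2vec embedding matrix with one row per vocabulary word.
rng = np.random.default_rng(0)
vocab = ["neural", "network", "language", "model", "training"]
d = 4  # embedding dimensionality
W = rng.standard_normal((len(vocab), d))

def context_vector(context_words, vocab):
    """Average context vector: normalized counts of the words
    observed in the target word's local contexts."""
    x = np.zeros(len(vocab))
    for w in context_words:
        if w in vocab:
            x[vocab.index(w)] += 1.0
    return x / max(x.sum(), 1.0)

def conec_embedding(context_words, W, vocab):
    """ConEc-style embedding for an (possibly OOV) word:
    multiply the embedding matrix with the word's average
    context vector, i.e. average the trained embeddings of
    its context words."""
    return context_vector(context_words, vocab) @ W

# An OOV word seen in the context "neural ... network" gets the
# mean of those two context words' embedding rows.
emb = conec_embedding(["neural", "network"], W, vocab)
```

Because the context vector is an average over all of a word's observed local contexts, the same mechanism yields distinct embeddings for a word with multiple meanings when its per-context vectors are used instead of the global average.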
Anthology ID:
W17-2602
Volume:
Proceedings of the 2nd Workshop on Representation Learning for NLP
Month:
August
Year:
2017
Address:
Vancouver, Canada
Editors:
Phil Blunsom, Antoine Bordes, Kyunghyun Cho, Shay Cohen, Chris Dyer, Edward Grefenstette, Karl Moritz Hermann, Laura Rimell, Jason Weston, Scott Yih
Venue:
RepL4NLP
SIG:
SIGREP
Publisher:
Association for Computational Linguistics
Pages:
10–14
URL:
https://aclanthology.org/W17-2602
DOI:
10.18653/v1/W17-2602
Cite (ACL):
Franziska Horn. 2017. Context encoders as a simple but powerful extension of word2vec. In Proceedings of the 2nd Workshop on Representation Learning for NLP, pages 10–14, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
Context encoders as a simple but powerful extension of word2vec (Horn, RepL4NLP 2017)
PDF:
https://aclanthology.org/W17-2602.pdf
Code:
cod3licious/conec