Obtaining Better Static Word Embeddings Using Contextual Embedding Models

Prakhar Gupta, Martin Jaggi


Abstract
The advent of contextual word embeddings (representations of words that incorporate semantic and syntactic information from their context) has led to tremendous improvements on a wide variety of NLP tasks. However, recent contextual models have prohibitively high computational cost in many use-cases and are often hard to interpret. In this work, we demonstrate that our proposed distillation method, a simple extension of CBOW-based training, allows us to significantly improve the computational efficiency of NLP applications while producing embeddings that outperform existing static embeddings trained from scratch as well as those distilled with previously proposed methods. As a side effect, our approach also allows a fair comparison of contextual and static embeddings via standard lexical evaluation tasks.
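The abstract describes the method only at a high level. As an illustration of what "a simple extension of CBOW-based training" for distillation can look like, the following minimal PyTorch sketch averages the static embeddings of a word's context and regresses that average onto a contextual vector for the centre position supplied by a teacher model. Everything here is an assumption for illustration, not the paper's implementation: the random placeholder teacher, the squared-error loss, the window size, and all names are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

VOCAB = ["the", "cat", "sat", "on", "mat"]
word2id = {w: i for i, w in enumerate(VOCAB)}
DIM, WINDOW = 16, 2

# Placeholder teacher: in the real setting this would be a contextual
# model (e.g. BERT) returning one vector per token occurrence in its
# sentence; fixed random vectors are used here so the sketch runs
# without downloading any model.
def teacher_vectors(token_ids):
    return torch.randn(len(token_ids), DIM)

static = nn.Embedding(len(VOCAB), DIM)   # the student: one vector per word
opt = torch.optim.Adam(static.parameters(), lr=1e-2)

sentence = ["the", "cat", "sat", "on", "the", "mat"]
ids = torch.tensor([word2id[w] for w in sentence])
targets = teacher_vectors(ids)           # one "contextual" vector per position

for step in range(200):
    loss = torch.tensor(0.0)
    for pos in range(len(ids)):
        lo, hi = max(0, pos - WINDOW), min(len(ids), pos + WINDOW + 1)
        ctx = [i for i in range(lo, hi) if i != pos]
        # CBOW-style context representation: average of the static
        # embeddings of the surrounding words ...
        ctx_vec = static(ids[ctx]).mean(dim=0)
        # ... pulled toward the teacher's vector for the centre position
        # (plain squared error here; the paper's actual loss may differ).
        loss = loss + F.mse_loss(ctx_vec, targets[pos])
    opt.zero_grad()
    loss.backward()
    opt.step()
```

After training, `static.weight` plays the role of an ordinary static embedding table, which is what makes the comparison against static embeddings trained from scratch (and the lexical evaluation mentioned in the abstract) possible.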
Anthology ID:
2021.acl-long.408
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
5241–5253
URL:
https://aclanthology.org/2021.acl-long.408
DOI:
10.18653/v1/2021.acl-long.408
PDF:
https://aclanthology.org/2021.acl-long.408.pdf