Always Keep your Target in Mind: Studying Semantics and Improving Performance of Neural Lexical Substitution

Nikolay Arefyev, Boris Sheludko, Alexander Podolskiy, Alexander Panchenko


Abstract
Lexical substitution, i.e., the generation of plausible words that can replace a particular target word in a given context, is an extremely powerful technology that can serve as a backbone of various NLP applications, including word sense induction and disambiguation, lexical relation extraction, data augmentation, etc. In this paper, we present a large-scale comparative study of lexical substitution methods employing both older and the most recent language and masked language models (LMs and MLMs), such as context2vec, ELMo, BERT, RoBERTa, and XLNet. We show that the already competitive results achieved by SOTA LMs/MLMs can be further improved substantially if information about the target word is injected properly. Several existing and new target word injection methods are compared for each LM/MLM using both intrinsic evaluation on lexical substitution datasets and extrinsic evaluation on word sense induction (WSI) datasets. On two WSI datasets we obtain new SOTA results. In addition, we analyze the types of semantic relations between target words and their substitutes generated by different models or given by annotators.
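The core idea described in the abstract, combining a model's context-based substitute distribution with information about the target word itself, can be illustrated with a toy sketch. All scores below are made-up numbers (not from the paper or any real model), and the log-linear combination is one simple hypothetical injection scheme, shown only to convey the intuition:

```python
import math

# Toy illustration of target-word injection for lexical substitution.
# Example sentence: "The coach gave his team a pep talk", target word "talk".

# Hypothetical P(word | context with target masked), as a masked LM alone
# might score candidates purely from context.
context_scores = {"speech": 0.20, "lecture": 0.15, "rally": 0.10, "dog": 0.01}

# Hypothetical similarity of each candidate to the target word "talk"
# (e.g., cosine similarity of word embeddings; values invented here).
target_similarity = {"speech": 0.90, "lecture": 0.70, "rally": 0.40, "dog": 0.05}

def inject_target(context_scores, target_similarity, alpha=1.0):
    """Combine context fit and target similarity in a weighted
    log-linear product, then renormalize into a distribution."""
    combined = {
        w: math.log(context_scores[w]) + alpha * math.log(target_similarity[w])
        for w in context_scores
    }
    z = sum(math.exp(s) for s in combined.values())
    return {w: math.exp(s) / z for w, s in combined.items()}

# Rank candidates by the combined score: words that fit the context
# AND resemble the target rise to the top.
ranked = sorted(inject_target(context_scores, target_similarity).items(),
                key=lambda kv: -kv[1])
print([w for w, _ in ranked])
```

With these invented scores, context-plausible but target-unrelated candidates (like "dog") are pushed down, while candidates that both fit the context and resemble the target ("speech") rise to the top, which is the effect target injection is meant to achieve.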
Anthology ID:
2020.coling-main.107
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
1242–1255
URL:
https://aclanthology.org/2020.coling-main.107
DOI:
10.18653/v1/2020.coling-main.107
Cite (ACL):
Nikolay Arefyev, Boris Sheludko, Alexander Podolskiy, and Alexander Panchenko. 2020. Always Keep your Target in Mind: Studying Semantics and Improving Performance of Neural Lexical Substitution. In Proceedings of the 28th International Conference on Computational Linguistics, pages 1242–1255, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Always Keep your Target in Mind: Studying Semantics and Improving Performance of Neural Lexical Substitution (Arefyev et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.107.pdf
Code:
bsheludko/lexical-substitution