Unsupervised Lexical Substitution with Decontextualised Embeddings

Takashi Wada, Timothy Baldwin, Yuji Matsumoto, Jey Han Lau


Abstract
We propose a new unsupervised method for lexical substitution using pre-trained language models. Compared to previous approaches that use the generative capability of language models to predict substitutes, our method retrieves substitutes based on the similarity of contextualised and decontextualised word embeddings, i.e. the average contextual representation of a word in multiple contexts. We conduct experiments in English and Italian, and show that our method substantially outperforms strong baselines and establishes a new state-of-the-art without any explicit supervision or fine-tuning. We further show that our method performs particularly well at predicting low-frequency substitutes, and also generates a diverse list of substitute candidates, reducing morphophonetic or morphosyntactic biases induced by article-noun agreement.
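The core idea in the abstract — representing each candidate word by a "decontextualised" embedding, i.e. its contextual representation averaged over several example sentences, and ranking candidates by similarity to the target's contextualised embedding — can be illustrated with a minimal sketch. The following is not the authors' implementation (see the linked repository for that); the encoder name, the toy example sentences, and the plain cosine-similarity scoring are illustrative assumptions.

```python
# Minimal sketch (not the paper's exact method): average a word's contextual
# vectors over multiple contexts to get a decontextualised embedding, then
# rank substitute candidates by cosine similarity to the target in context.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # illustrative choice
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def word_embedding(sentence: str, word: str) -> torch.Tensor:
    """Average the last-layer vectors of `word`'s subword tokens in `sentence`."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]            # (seq_len, dim)
    word_ids = tokenizer(word, add_special_tokens=False)["input_ids"]
    ids = enc["input_ids"][0].tolist()
    for i in range(len(ids) - len(word_ids) + 1):             # first occurrence
        if ids[i:i + len(word_ids)] == word_ids:
            return hidden[i:i + len(word_ids)].mean(dim=0)
    raise ValueError(f"'{word}' not found in sentence")

def decontextualised_embedding(word: str, contexts: list[str]) -> torch.Tensor:
    """Average the word's contextual embeddings over multiple example contexts."""
    return torch.stack([word_embedding(s, word) for s in contexts]).mean(dim=0)

# Toy usage: rank candidate substitutes for "bright" in a given context.
context = "She is a bright student who learns quickly."
target_vec = word_embedding(context, "bright")
candidates = {
    "clever": ["a clever idea", "a clever student"],          # toy context sets
    "shiny": ["a shiny coin", "a shiny surface"],
}
scores = {
    w: torch.cosine_similarity(target_vec, decontextualised_embedding(w, sents), dim=0).item()
    for w, sents in candidates.items()
}
print(sorted(scores.items(), key=lambda kv: -kv[1]))
```

In this sketch a context-appropriate substitute ("clever") should score above a sense-mismatched one ("shiny"); the paper's actual ranking combines contextualised and decontextualised similarities in a more refined way, as described in the full text.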
Anthology ID:
2022.coling-1.366
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Editors:
Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, Younggyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, Seung-Hoon Na
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
4172–4185
URL:
https://aclanthology.org/2022.coling-1.366
Cite (ACL):
Takashi Wada, Timothy Baldwin, Yuji Matsumoto, and Jey Han Lau. 2022. Unsupervised Lexical Substitution with Decontextualised Embeddings. In Proceedings of the 29th International Conference on Computational Linguistics, pages 4172–4185, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Unsupervised Lexical Substitution with Decontextualised Embeddings (Wada et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.366.pdf
Code
twadada/lexsub_decontextualised
Data
OSCAR, SWORDS