Word Sense Filtering Improves Embedding-Based Lexical Substitution

Anne Cocos, Marianna Apidianaki, Chris Callison-Burch


Abstract
The role of word sense disambiguation in lexical substitution has been questioned due to the high performance of vector space models, which propose good substitutes without explicitly accounting for sense. We show that a filtering mechanism based on a sense inventory optimized for substitutability can improve the results of these models. Our sense inventory is constructed using a clustering method which generates paraphrase clusters that are congruent with lexical substitution annotations in a development set. The results show that lexical substitution can still benefit from senses, which can improve the output of vector space paraphrase ranking models.
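The abstract describes filtering the substitutes proposed by a vector space model through a sense inventory made of paraphrase clusters. As a rough illustration only, and not the authors' implementation, the sketch below assumes the clusters are given as sets of words with embeddings available for cluster members and the context: it picks the cluster whose centroid is most similar to a context embedding and discards ranked candidates outside it. All names (filter_by_sense, word_vecs, etc.) are hypothetical.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def centroid(vectors):
    """Component-wise mean of a non-empty list of vectors."""
    n = len(vectors)
    return [sum(col) / n for col in zip(*vectors)]

def filter_by_sense(ranked_candidates, sense_clusters, context_vec, word_vecs):
    """Keep only candidates from the paraphrase cluster (sense) whose
    centroid is most similar to the context vector.

    ranked_candidates: list of (substitute, score) pairs from a vector space ranker
    sense_clusters:    list of sets of paraphrases, one set per induced sense
    context_vec:       embedding of the target word's sentential context
    word_vecs:         dict mapping words to their embeddings
    """
    # Score each sense cluster against the context.
    best_cluster, best_sim = None, float("-inf")
    for cluster in sense_clusters:
        members = [word_vecs[w] for w in cluster if w in word_vecs]
        if not members:
            continue
        sim = cosine(centroid(members), context_vec)
        if sim > best_sim:
            best_cluster, best_sim = cluster, sim

    if best_cluster is None:
        return ranked_candidates  # fall back to the unfiltered ranking

    # Discard candidates that do not belong to the selected sense cluster.
    return [(w, s) for w, s in ranked_candidates if w in best_cluster]
```

The clustering step itself, which the abstract says produces paraphrase clusters congruent with lexical substitution annotations in a development set, is taken as given here; the sketch covers only the filtering stage applied to the ranker's output.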
Anthology ID: W17-1914
Volume: Proceedings of the 1st Workshop on Sense, Concept and Entity Representations and their Applications
Month: April
Year: 2017
Address: Valencia, Spain
Editors: Jose Camacho-Collados, Mohammad Taher Pilehvar
Venue: SENSE
Publisher: Association for Computational Linguistics
Pages: 110–119
URL: https://aclanthology.org/W17-1914
DOI: 10.18653/v1/W17-1914
Cite (ACL): Anne Cocos, Marianna Apidianaki, and Chris Callison-Burch. 2017. Word Sense Filtering Improves Embedding-Based Lexical Substitution. In Proceedings of the 1st Workshop on Sense, Concept and Entity Representations and their Applications, pages 110–119, Valencia, Spain. Association for Computational Linguistics.
Cite (Informal): Word Sense Filtering Improves Embedding-Based Lexical Substitution (Cocos et al., SENSE 2017)
PDF: https://aclanthology.org/W17-1914.pdf