WordNet Is All You Need: A Surprisingly Effective Unsupervised Method for Graded Lexical Entailment

Joseph Renner, Pascal Denis, Rémi Gilleron

Abstract
We propose a simple unsupervised approach that relies exclusively on WordNet (Miller, 1995) for predicting graded lexical entailment (GLE) in English. Inspired by the seminal work of Resnik (1995), our method models GLE as the sum of two information-theoretic scores: a symmetric semantic similarity score and an asymmetric specificity loss score, both exploiting the hierarchical synset structure of WordNet. Our approach also includes a simple disambiguation mechanism to handle polysemy in a given word pair. Despite its simplicity, our method sets a new state of the art (Spearman ρ = 0.75) on HyperLex (Vulić et al., 2017), the largest GLE dataset, outperforming all previous methods, including specialized word embedding approaches that use WordNet as weak supervision.
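To make the abstract's two-part score concrete, here is a toy sketch of the general idea: Resnik-style information content (IC = −log p, with a concept's probability mass covering its descendants) computed over a tiny hand-built IS-A hierarchy, combined as a symmetric similarity (IC of the least common subsumer) minus an asymmetric specificity term. The taxonomy, counts, and the exact combination are illustrative assumptions, not the paper's actual scoring functions, which operate over WordNet itself.

```python
import math

# Toy IS-A hierarchy (child -> parent), a stand-in for WordNet's synset graph.
# All nodes, counts, and formulas below are illustrative assumptions.
PARENT = {
    "animal": None,
    "mammal": "animal",
    "bird": "animal",
    "dog": "mammal",
    "cat": "mammal",
    "terrier": "dog",
}

# Made-up corpus counts for each concept.
COUNTS = {"animal": 10, "mammal": 20, "bird": 15, "dog": 30, "cat": 25, "terrier": 5}
TOTAL = sum(COUNTS.values())


def ancestors_plus_self(node):
    """Path from node up to the root, node first."""
    out = [node]
    while PARENT[node] is not None:
        node = PARENT[node]
        out.append(node)
    return out


def prob(node):
    """p(c): a concept's probability mass includes all of its descendants."""
    mass = sum(c for n, c in COUNTS.items() if node in ancestors_plus_self(n))
    return mass / TOTAL


def ic(node):
    """Information content (Resnik, 1995): IC(c) = -log p(c)."""
    return -math.log(prob(node))


def lcs(a, b):
    """Least common subsumer: the most specific shared ancestor."""
    anc_b = set(ancestors_plus_self(b))
    for n in ancestors_plus_self(a):  # ordered from most to least specific
        if n in anc_b:
            return n


def graded_entailment(hypo, hyper):
    """Illustrative graded score for 'hypo entails hyper'."""
    common = lcs(hypo, hyper)
    sim = ic(common)                    # symmetric: Resnik similarity
    spec_loss = ic(hyper) - ic(common)  # asymmetric: 0 when hyper subsumes hypo
    return sim - spec_loss
```

The asymmetric term is what distinguishes the two directions: `graded_entailment("terrier", "mammal")` incurs no specificity loss (mammal subsumes terrier, so the LCS is mammal itself), while the reversed pair is penalized by how much more specific "terrier" is than their common ancestor.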
Anthology ID:
2023.findings-emnlp.615
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
9176–9182
URL:
https://aclanthology.org/2023.findings-emnlp.615
DOI:
10.18653/v1/2023.findings-emnlp.615
Cite (ACL):
Joseph Renner, Pascal Denis, and Rémi Gilleron. 2023. WordNet Is All You Need: A Surprisingly Effective Unsupervised Method for Graded Lexical Entailment. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 9176–9182, Singapore. Association for Computational Linguistics.
Cite (Informal):
WordNet Is All You Need: A Surprisingly Effective Unsupervised Method for Graded Lexical Entailment (Renner et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.615.pdf