Leveraging WordNet Paths for Neural Hypernym Prediction

Yejin Cho, Juan Diego Rodriguez, Yifan Gao, Katrin Erk


Abstract
We formulate the problem of hypernym prediction as a sequence generation task, where the sequences are taxonomy paths in WordNet. Our experiments with encoder-decoder models show that training to generate taxonomy paths can improve the performance of direct hypernym prediction. Our simple but powerful hypo2path model achieves state-of-the-art performance, outperforming the best benchmark by 4.11 hit-at-one (H@1) points.
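To illustrate the formulation, the target sequence for a query hyponym is its root-to-leaf taxonomy path. The sketch below (not the authors' code) extracts such a path from a hand-picked toy fragment of the WordNet noun hierarchy; a hypo2path-style decoder would be trained to generate this sequence, with the direct hypernym appearing immediately before the query word.

```python
# Toy fragment of the WordNet noun hierarchy (hyponym -> direct hypernym).
# This hand-built dict is an illustrative assumption, not the paper's data.
PARENT = {
    "dog": "canine",
    "canine": "carnivore",
    "carnivore": "mammal",
    "mammal": "animal",
    "animal": "entity",
}

def taxonomy_path(hyponym):
    """Return the root-to-hyponym taxonomy path: the target sequence a
    hypo2path-style decoder would be trained to generate."""
    path = [hyponym]
    while path[-1] in PARENT:          # walk up until the root
        path.append(PARENT[path[-1]])
    return list(reversed(path))        # root first, query word last

path = taxonomy_path("dog")
print(path)        # ['entity', 'animal', 'mammal', 'carnivore', 'canine', 'dog']
print(path[-2])    # direct hypernym prediction: 'canine'
```

Reading the direct hypernym off the generated path (the second-to-last token) is what makes path generation usable for the hypernym prediction task itself.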
Anthology ID:
2020.coling-main.268
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
3007–3018
URL:
https://aclanthology.org/2020.coling-main.268
DOI:
10.18653/v1/2020.coling-main.268
Cite (ACL):
Yejin Cho, Juan Diego Rodriguez, Yifan Gao, and Katrin Erk. 2020. Leveraging WordNet Paths for Neural Hypernym Prediction. In Proceedings of the 28th International Conference on Computational Linguistics, pages 3007–3018, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Leveraging WordNet Paths for Neural Hypernym Prediction (Cho et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.268.pdf
Code
scarletcho/hypernym-path-generation
Data
ConceptNet, WN18, WN18RR