Latent semantic network induction in the context of linked example senses

Hunter Heidenreich, Jake Williams


Abstract
The Princeton WordNet is a powerful tool for studying language and developing natural language processing algorithms. Among the significant efforts to develop it further, one line of work extends it by aligning its expert-annotated structure with other lexical resources. In contrast, this work explores a completely data-driven approach to network construction, forming a wordnet from the entirety of the open-source, noisy, user-annotated dictionary Wiktionary. Comparing baselines to WordNet, we find compelling evidence that our network induction process constructs a network with useful semantic structure. With thousands of semantically linked examples demonstrating sense usage, from basic lemmas to multiword expressions (MWEs), we believe this work motivates future research.
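To make the idea of inducing a semantic network from linked example senses concrete, here is a minimal, purely illustrative sketch (not the paper's actual method): given toy dictionary-style entries, it links a sense to the senses of any other lemma that appears in its usage example. All entry data, sense IDs, and the linking rule are assumptions for illustration.

```python
# Hypothetical sketch of sense-network induction from a dictionary.
# Assumption: a sense is linked to every sense of any other lemma
# mentioned in its usage example (this is NOT the authors' algorithm).

# Toy Wiktionary-style entries: lemma -> list of (sense_id, example sentence).
ENTRIES = {
    "bank": [("bank.1", "she sat on the bank of the river"),
             ("bank.2", "he deposited cash at the bank")],
    "river": [("river.1", "the river flows to the sea")],
    "cash": [("cash.1", "he paid in cash")],
}

def induce_network(entries):
    """Return a set of directed edges (source sense -> target sense)."""
    # Index every lemma's sense IDs so mentions in examples can be resolved.
    lemma_senses = {lemma: [sid for sid, _ in senses]
                    for lemma, senses in entries.items()}
    edges = set()
    for lemma, senses in entries.items():
        for sid, example in senses:
            for tok in example.lower().split():
                # Skip self-mentions of the headword itself.
                if tok in lemma_senses and tok != lemma:
                    # The mention is ambiguous, so link to every sense
                    # of the mentioned lemma.
                    for target in lemma_senses[tok]:
                        edges.add((sid, target))
    return edges

edges = induce_network(ENTRIES)
# "bank.1" mentions "river"; "bank.2" mentions "cash".
```

A real system would of course need tokenization, lemmatization, MWE handling, and sense disambiguation of the mentions; this sketch only shows the shape of the graph being built.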
Anthology ID: D19-5523
Volume: Proceedings of the 5th Workshop on Noisy User-generated Text (W-NUT 2019)
Month: November
Year: 2019
Address: Hong Kong, China
Editors: Wei Xu, Alan Ritter, Tim Baldwin, Afshin Rahimi
Venue: WNUT
Publisher: Association for Computational Linguistics
Pages: 170–180
URL: https://aclanthology.org/D19-5523
DOI: 10.18653/v1/D19-5523
Cite (ACL): Hunter Heidenreich and Jake Williams. 2019. Latent semantic network induction in the context of linked example senses. In Proceedings of the 5th Workshop on Noisy User-generated Text (W-NUT 2019), pages 170–180, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal): Latent semantic network induction in the context of linked example senses (Heidenreich & Williams, WNUT 2019)
PDF: https://aclanthology.org/D19-5523.pdf