Lacking the Embedding of a Word? Look it up into a Traditional Dictionary

Elena Sofia Ruzzetti, Leonardo Ranaldi, Michele Mastromattei, Francesca Fallucchi, Noemi Scarpato, Fabio Massimo Zanzotto


Abstract
Word embeddings are powerful dictionaries, which may easily capture language variations. However, these dictionaries fail to give sense to rare words, which are surprisingly often covered by traditional dictionaries. In this paper, we propose to use definitions retrieved from traditional dictionaries to produce word embeddings for rare words. For this purpose, we introduce two methods: Definition Neural Network (DefiNNet) and Define BERT (DefBERT). In our experiments, DefiNNet and DefBERT significantly outperform state-of-the-art as well as baseline methods devised for producing embeddings of unknown words. In fact, DefiNNet significantly outperforms FastText, which implements a method for the same task based on n-grams, and DefBERT significantly outperforms the BERT method for OOV words. Hence, definitions in traditional dictionaries are useful for building word embeddings for rare words.
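The abstract's core idea, deriving an embedding for a rare word from its dictionary definition, can be illustrated with a minimal sketch. This is not the authors' DefiNNet or DefBERT; it only shows the simplest definition-pooling baseline (averaging the pretrained vectors of the definition's words), with a toy embedding table and a made-up definition standing in for a real dictionary lookup.

```python
# Hedged sketch: build a vector for an out-of-vocabulary word by pooling
# the embeddings of the words in its dictionary definition.
import numpy as np

# Toy pretrained embedding table (assumption: a real system would use
# word2vec/BERT vectors and a real dictionary such as WordNet).
EMB = {
    "small": np.array([0.9, 0.1, 0.0]),
    "dog":   np.array([0.1, 0.9, 0.2]),
    "young": np.array([0.8, 0.2, 0.1]),
}

def definition_embedding(definition_words, emb=EMB):
    """Average the known-word vectors of a definition to stand in for an OOV word."""
    vecs = [emb[w] for w in definition_words if w in emb]
    if not vecs:
        raise ValueError("no definition word has an embedding")
    return np.mean(vecs, axis=0)

# "puppy" is OOV; suppose its dictionary definition is roughly "young small dog".
puppy_vec = definition_embedding(["young", "small", "dog"])
print(puppy_vec)  # mean of the three definition-word vectors
```

DefiNNet replaces the plain average with a learned neural combination of definition words, and DefBERT encodes the full definition with BERT; the sketch above is only the trivial baseline both are compared against in spirit.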
Anthology ID:
2022.findings-acl.208
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2651–2662
URL:
https://aclanthology.org/2022.findings-acl.208
DOI:
10.18653/v1/2022.findings-acl.208
Cite (ACL):
Elena Sofia Ruzzetti, Leonardo Ranaldi, Michele Mastromattei, Francesca Fallucchi, Noemi Scarpato, and Fabio Massimo Zanzotto. 2022. Lacking the Embedding of a Word? Look it up into a Traditional Dictionary. In Findings of the Association for Computational Linguistics: ACL 2022, pages 2651–2662, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Lacking the Embedding of a Word? Look it up into a Traditional Dictionary (Ruzzetti et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.208.pdf
Software:
 2022.findings-acl.208.software.zip
Video:
 https://aclanthology.org/2022.findings-acl.208.mp4