A Comprehensive Comparison of Word Embeddings in Event & Entity Coreference Resolution.

Judicael Poumay, Ashwin Ittoo


Abstract
Coreference Resolution is an important NLP task, and most state-of-the-art methods rely on word embeddings for word representation. However, one issue that has been largely overlooked in the literature is how different embeddings compare, both within and across families. We therefore frame our study in the context of Event and Entity Coreference Resolution (EvCR & EnCR) and address two questions: 1) Is there a trade-off between performance (predictive and run-time) and embedding size? 2) How do the embeddings compare in performance within and across families? Our experiments reveal several interesting findings. First, we observe diminishing returns in performance with respect to embedding size: for example, a model using only a character embedding achieves 86% of the performance of the largest model (ELMo, GloVe, Character) while being 1.2% of its size. Second, larger models that use multiple embeddings learn faster despite being slower per epoch, but they remain slower at test time. Finally, ELMo performs best on both EvCR and EnCR, while GloVe and FastText perform best in EvCR and EnCR, respectively.
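The abstract compares models built from different embedding families (ELMo, GloVe, FastText, character) and combinations thereof. The sketch below is only an illustration of the general idea of concatenating embeddings from several families into one mention representation; it is not the authors' implementation, and the function names, lookup interfaces, and vector dimensions (1024/300/50) are assumptions for the example.

```python
# Illustrative sketch (not the paper's model): represent a mention by
# concatenating vectors from several embedding families.
# Dimensions are assumed values, not taken from the paper.
import numpy as np

ELMO_DIM, GLOVE_DIM, CHAR_DIM = 1024, 300, 50

def embed_mention(tokens, elmo_lookup, glove_lookup, char_encoder):
    """Average per-token vectors from each family, then concatenate.

    elmo_lookup / glove_lookup / char_encoder: callables mapping a token
    to a NumPy vector (assumed to be provided by the user).
    """
    elmo = np.mean([elmo_lookup(t) for t in tokens], axis=0)
    glove = np.mean([glove_lookup(t) for t in tokens], axis=0)
    char = np.mean([char_encoder(t) for t in tokens], axis=0)
    # Dropping components (e.g., keeping only the character vector) shrinks
    # the representation, which is the size/performance trade-off studied.
    return np.concatenate([elmo, glove, char])

if __name__ == "__main__":
    # Dummy lookups with random vectors, just to make the sketch runnable.
    rng = np.random.default_rng(0)
    dummy = lambda dim: (lambda tok: rng.standard_normal(dim))
    vec = embed_mention(["hits", "the", "ball"],
                        dummy(ELMO_DIM), dummy(GLOVE_DIM), dummy(CHAR_DIM))
    print(vec.shape)  # (1374,)
```

In this framing, the "embedding size" dimension of the study corresponds to how many components are concatenated and how large each one is.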
Anthology ID:
2021.findings-emnlp.235
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2755–2764
URL:
https://aclanthology.org/2021.findings-emnlp.235
DOI:
10.18653/v1/2021.findings-emnlp.235
Bibkey:
Cite (ACL):
Judicael Poumay and Ashwin Ittoo. 2021. A Comprehensive Comparison of Word Embeddings in Event & Entity Coreference Resolution.. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 2755–2764, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
A Comprehensive Comparison of Word Embeddings in Event & Entity Coreference Resolution. (Poumay & Ittoo, Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.235.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.235.mp4
Code:
judicaelpoumay/event_entity_coref_ecb_plus
Data:
ECB+