Ultra-Fine Entity Typing with Prior Knowledge about Labels: A Simple Clustering Based Strategy

Na Li, Zied Bouraoui, Steven Schockaert


Abstract
Ultra-fine entity typing (UFET) is the task of inferring, from a large set of fine-grained candidates, the semantic types that apply to a given entity mention. This task is especially challenging because we only have a small number of training examples for many types, even with distant supervision strategies. State-of-the-art models therefore have to rely on prior knowledge about the type labels in some way. In this paper, we show that the performance of existing methods can be improved using a simple technique: we use pre-trained label embeddings to cluster the labels into semantic domains and then treat these domains as additional types. We show that this strategy consistently leads to improved results, as long as high-quality label embeddings are used. Furthermore, we use the label clusters as part of a simple post-processing technique, which results in further performance gains. Both strategies treat the UFET model as a black box and can thus straightforwardly be used to improve a wide range of existing models.
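The clustering-and-augmentation idea from the abstract can be illustrated with a minimal sketch. The code below is not the authors' released implementation: the names (label_vectors, n_domains, the domain_* label format) are hypothetical, and the paper's actual choice of embedding model, clustering algorithm, and number of clusters may differ. It simply shows the shape of the strategy: cluster pre-trained label embeddings into semantic domains, then add each example's domains as extra target types.

import numpy as np
from sklearn.cluster import KMeans

def cluster_labels(label_vectors, n_domains=50, seed=0):
    """Cluster type labels into semantic domains using their embeddings.

    label_vectors: dict mapping each type label (str) to a pre-trained
    embedding (1-D numpy array). Returns a dict label -> domain id.
    """
    labels = sorted(label_vectors)
    X = np.stack([label_vectors[l] for l in labels])
    km = KMeans(n_clusters=n_domains, n_init=10, random_state=seed).fit(X)
    return {l: int(c) for l, c in zip(labels, km.labels_)}

def augment_with_domains(example_labels, label2domain):
    """Treat each gold label's domain as an additional (synthetic) type."""
    domains = {f"domain_{label2domain[l]}" for l in example_labels
               if l in label2domain}
    return set(example_labels) | domains

# Toy usage (embeddings v1, v2, ... assumed given):
#   label2domain = cluster_labels({"violinist": v1, "cellist": v2, ...})
#   augment_with_domains({"violinist"}, label2domain)
#   -> {"violinist", "domain_7"}  (domain id depends on the clustering)

Because the augmentation only rewrites the target label sets, any existing UFET model can be trained on the augmented data unchanged, which is what makes the strategy a black-box addition.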
Anthology ID:
2023.findings-emnlp.786
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
11744–11756
URL:
https://aclanthology.org/2023.findings-emnlp.786
DOI:
10.18653/v1/2023.findings-emnlp.786
Cite (ACL):
Na Li, Zied Bouraoui, and Steven Schockaert. 2023. Ultra-Fine Entity Typing with Prior Knowledge about Labels: A Simple Clustering Based Strategy. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 11744–11756, Singapore. Association for Computational Linguistics.
Cite (Informal):
Ultra-Fine Entity Typing with Prior Knowledge about Labels: A Simple Clustering Based Strategy (Li et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.786.pdf