Denoising Enhanced Distantly Supervised Ultrafine Entity Typing

Yue Zhang, Hongliang Fei, Ping Li


Abstract
Recently, the task of distantly supervised (DS) ultra-fine entity typing has received significant attention. However, DS data is noisy and often suffers from missing or wrong labeling issues, resulting in low precision and low recall. This paper proposes a novel ultra-fine entity typing model with denoising capability. Specifically, we build a noise model to estimate the unknown labeling noise distribution over input contexts and noisy type labels. With the noise model, more trustworthy labels can be recovered by subtracting the estimated noise from the input. Furthermore, we propose an entity typing model that adopts a bi-encoder architecture and is trained on the denoised data. Finally, the noise model and the entity typing model are trained iteratively to enhance each other. We conduct extensive experiments on the Ultra-Fine entity typing dataset as well as the OntoNotes dataset and demonstrate that our approach significantly outperforms other baseline methods.
Anthology ID:
2023.findings-acl.626
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
9880–9892
URL:
https://aclanthology.org/2023.findings-acl.626
DOI:
10.18653/v1/2023.findings-acl.626
Cite (ACL):
Yue Zhang, Hongliang Fei, and Ping Li. 2023. Denoising Enhanced Distantly Supervised Ultrafine Entity Typing. In Findings of the Association for Computational Linguistics: ACL 2023, pages 9880–9892, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Denoising Enhanced Distantly Supervised Ultrafine Entity Typing (Zhang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.626.pdf