Noise-Robust Training with Dynamic Loss and Contrastive Learning for Distantly-Supervised Named Entity Recognition

Zhiyuan Ma, Jintao Du, Shuheng Zhou


Abstract
Distantly-supervised named entity recognition (NER) aims at training networks with distantly-labeled data, which is automatically obtained by matching entity mentions in the raw text with entity types in a knowledge base. Distant supervision may induce incomplete and noisy labels, so recent state-of-the-art methods employ a sample selection mechanism to separate clean data from noisy data based on the model's prediction scores. However, they ignore the change in noise distribution caused by data selection, and they simply exclude noisy data during training, resulting in information loss. We propose to (1) use a dynamic loss function to better adapt to the changing noise during the training process, and (2) incorporate token-level contrastive learning to fully utilize the noisy data and facilitate feature learning without relying on labels. Our method achieves superior performance on three benchmark datasets, outperforming existing distantly supervised NER models by significant margins.
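The abstract names two components: a dynamic loss that adapts as the noise distribution shifts during training, and a token-level contrastive term that exploits noisy tokens without using their distant labels. The sketch below is a minimal, hypothetical illustration of these two ideas, not the authors' exact formulation; the linear interpolation schedule, the bootstrapping form of the dynamic loss, the dropout-based two-view construction, and the function names are all assumptions made for illustration.

```python
# Hypothetical sketch of a dynamic noise-robust loss plus a label-free,
# token-level contrastive loss; NOT the paper's exact method.
import torch
import torch.nn.functional as F


def dynamic_loss(logits, labels, epoch, total_epochs):
    """Interpolate from standard cross-entropy (early, when selected data is
    relatively clean) toward a bootstrapped soft-label loss (later, as the
    noise distribution of the selected data changes). The linear schedule
    `beta = epoch / total_epochs` is an assumption for illustration.
    logits: (N, C) token logits; labels: (N,) distant labels."""
    ce = F.cross_entropy(logits, labels)
    # Bootstrapping term: trust the model's own (detached) predictions more
    # as training progresses.
    probs = F.softmax(logits, dim=-1).detach()
    soft = -(probs * F.log_softmax(logits, dim=-1)).sum(dim=-1).mean()
    beta = min(1.0, epoch / total_epochs)  # hypothetical schedule
    return (1.0 - beta) * ce + beta * soft


def token_contrastive_loss(view1, view2, temperature=0.1):
    """Label-free InfoNCE over token representations from two stochastic views
    (e.g. two dropout-perturbed forward passes of the same sentence): each
    token is pulled toward its own other view and pushed away from every other
    token in the batch, so noisy tokens still contribute to feature learning.
    view1, view2: (N, d) token representations."""
    z1 = F.normalize(view1, dim=-1)
    z2 = F.normalize(view2, dim=-1)
    sim = z1 @ z2.t() / temperature                      # (N, N) similarities
    targets = torch.arange(z1.size(0), device=sim.device)
    return F.cross_entropy(sim, targets)
```

In a training loop, the two terms would typically be combined as a weighted sum, e.g. `loss = dynamic_loss(logits, labels, epoch, E) + lam * token_contrastive_loss(h1, h2)`, where `lam` is a hypothetical balancing hyperparameter.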
Anthology ID:
2023.findings-acl.643
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
10119–10128
URL:
https://aclanthology.org/2023.findings-acl.643
DOI:
10.18653/v1/2023.findings-acl.643
Cite (ACL):
Zhiyuan Ma, Jintao Du, and Shuheng Zhou. 2023. Noise-Robust Training with Dynamic Loss and Contrastive Learning for Distantly-Supervised Named Entity Recognition. In Findings of the Association for Computational Linguistics: ACL 2023, pages 10119–10128, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Noise-Robust Training with Dynamic Loss and Contrastive Learning for Distantly-Supervised Named Entity Recognition (Ma et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.643.pdf
Video:
https://aclanthology.org/2023.findings-acl.643.mp4