Improving Named Entity Recognition via Bridge-based Domain Adaptation

Jingyun Xu, Changmeng Zheng, Yi Cai, Tat-Seng Chua


Abstract
Recent studies have shown remarkable success in cross-domain named entity recognition (cross-domain NER). Despite the promising results, existing methods mainly utilize pre-trained language models like BERT to represent words. As such, the original chaotic representations may make it difficult to distinguish entity types, leading to entity type misclassification. To this end, we attempt to utilize contrastive learning to refine the original representations and propose a model-agnostic framework named MoCL for cross-domain NER. Additionally, we combine MoCL with two distinctive cross-domain NER methods and two pre-trained language models, respectively, to explore its generalization ability. Empirical results on seven domains show the effectiveness and good generalization ability of MoCL.
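The abstract describes using contrastive learning to refine token representations so that tokens of the same entity type cluster together. The paper's exact loss formulation is not given here; as a hedged illustration only, the sketch below implements a generic supervised contrastive loss (in the style commonly used for representation refinement) over a batch of token embeddings, with a hypothetical temperature parameter:

```python
import numpy as np

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """A generic supervised contrastive loss sketch (NOT the paper's exact
    MoCL objective): pulls tokens with the same entity-type label together
    and pushes tokens with different labels apart in embedding space."""
    # L2-normalize so dot products become cosine similarities.
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = (z @ z.T) / temperature          # pairwise scaled similarities
    n = len(labels)
    self_mask = np.eye(n, dtype=bool)
    exp_sim = np.exp(sim)
    exp_sim[self_mask] = 0.0               # exclude self-comparisons
    losses = []
    for i in range(n):
        positives = (labels == labels[i]) & ~self_mask[i]
        if not positives.any():
            continue                       # no same-type partner in the batch
        # Log-probability of each positive relative to all other tokens.
        log_prob = sim[i][positives] - np.log(exp_sim[i].sum())
        losses.append(-log_prob.mean())
    return float(np.mean(losses))
```

In this sketch, a batch whose embeddings already cluster by entity type yields a lower loss than one where same-type tokens are scattered, which is the refinement effect the abstract attributes to contrastive learning.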
Anthology ID:
2023.findings-acl.238
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3869–3882
URL:
https://aclanthology.org/2023.findings-acl.238
DOI:
10.18653/v1/2023.findings-acl.238
Cite (ACL):
Jingyun Xu, Changmeng Zheng, Yi Cai, and Tat-Seng Chua. 2023. Improving Named Entity Recognition via Bridge-based Domain Adaptation. In Findings of the Association for Computational Linguistics: ACL 2023, pages 3869–3882, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Improving Named Entity Recognition via Bridge-based Domain Adaptation (Xu et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.238.pdf