Improving Self-training for Cross-lingual Named Entity Recognition with Contrastive and Prototype Learning

Ran Zhou, Xin Li, Lidong Bing, Erik Cambria, Chunyan Miao


Abstract
In cross-lingual named entity recognition (NER), self-training is commonly used to bridge the linguistic gap by training on pseudo-labeled target-language data. However, due to sub-optimal performance on target languages, the pseudo labels are often noisy and limit the overall performance. In this work, we aim to improve self-training for cross-lingual NER by combining representation learning and pseudo-label refinement in one coherent framework. Our proposed method, namely ContProto, mainly comprises two components: (1) contrastive self-training and (2) prototype-based pseudo-labeling. Our contrastive self-training facilitates span classification by separating clusters of different classes, and enhances cross-lingual transferability by producing closely aligned representations between the source and target language. Meanwhile, prototype-based pseudo-labeling effectively improves the accuracy of pseudo labels during training. We evaluate ContProto on multiple transfer pairs, and experimental results show that our method brings substantial improvements over current state-of-the-art methods.
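The two components described in the abstract lend themselves to a compact illustration. Below is a minimal PyTorch sketch of both ideas: a supervised contrastive loss that pulls together same-class span representations from batches mixing gold source-language and pseudo-labeled target-language spans, and a prototype store that refines noisy pseudo labels by nearest-prototype reassignment. This is an illustrative reading of the abstract, not the authors' released implementation; the masking details, temperature, EMA momentum, and all names (supervised_contrastive_loss, PrototypeRefiner) are assumptions.

# Illustrative sketch only (not the authors' code): a supervised
# contrastive loss over span representations, plus EMA class
# prototypes used to refine noisy pseudo labels.
import torch
import torch.nn.functional as F


def supervised_contrastive_loss(reps, labels, temperature=0.1):
    """Pull same-class representations together, push classes apart.

    reps:   (N, d) span representations from a batch mixing gold
            source-language spans and pseudo-labeled target spans.
    labels: (N,) integer class ids (gold or pseudo).
    """
    reps = F.normalize(reps, dim=-1)               # work in cosine space
    sim = reps @ reps.t() / temperature            # (N, N) similarity logits
    n = sim.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=sim.device)
    sim = sim.masked_fill(self_mask, float("-inf"))  # exclude self-pairs
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_mask = labels.unsqueeze(0).eq(labels.unsqueeze(1)) & ~self_mask
    pos_counts = pos_mask.sum(dim=1)
    # Average log-likelihood over positives, for anchors that have any.
    sum_pos = log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1)
    has_pos = pos_counts > 0
    return -(sum_pos[has_pos] / pos_counts[has_pos]).mean()


class PrototypeRefiner:
    """Keep one EMA prototype per entity class and relabel target
    spans by their nearest prototype (cosine similarity)."""

    def __init__(self, num_classes, dim, momentum=0.99):
        self.momentum = momentum
        self.prototypes = torch.zeros(num_classes, dim)

    @torch.no_grad()
    def update(self, reps, labels):
        # Move each class prototype toward the mean representation
        # of the spans currently assigned to that class.
        for c in labels.unique():
            mean_rep = reps[labels == c].mean(dim=0)
            self.prototypes[c] = (self.momentum * self.prototypes[c]
                                  + (1 - self.momentum) * mean_rep)

    @torch.no_grad()
    def refine(self, reps):
        # Reassign pseudo labels to the closest class prototype,
        # gradually correcting noisy self-training labels.
        sims = F.normalize(reps, dim=-1) @ F.normalize(self.prototypes, dim=-1).t()
        return sims.argmax(dim=1)

In such a setup, each training step would compute the contrastive loss on the mixed batch, update the prototypes with the current representations, and use refine() to overwrite the pseudo labels for the next round of self-training.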
Anthology ID:
2023.acl-long.222
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
4018–4031
URL:
https://aclanthology.org/2023.acl-long.222
DOI:
10.18653/v1/2023.acl-long.222
Cite (ACL):
Ran Zhou, Xin Li, Lidong Bing, Erik Cambria, and Chunyan Miao. 2023. Improving Self-training for Cross-lingual Named Entity Recognition with Contrastive and Prototype Learning. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 4018–4031, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Improving Self-training for Cross-lingual Named Entity Recognition with Contrastive and Prototype Learning (Zhou et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.222.pdf
Video:
https://aclanthology.org/2023.acl-long.222.mp4