An Unsupervised Multiple-Task and Multiple-Teacher Model for Cross-lingual Named Entity Recognition

Zhuoran Li, Chunming Hu, Xiaohui Guo, Junfan Chen, Wenyi Qin, Richong Zhang


Abstract
Cross-lingual named entity recognition is a critical problem for evaluating transfer learning techniques on low-resource languages. Knowledge distillation with pre-trained multilingual language models between source and target languages has shown its effectiveness for transfer. However, existing cross-lingual distillation models consider only the transferability between two identical single tasks across the two domains; other auxiliary tasks that could improve learning performance have not been fully investigated. In this study, building on the knowledge distillation framework and multi-task learning, we introduce a similarity metric model as an auxiliary task to improve cross-lingual NER performance on the target domain. Specifically, an entity recognizer and a similarity evaluator are first trained in parallel as two teachers on the source domain. Then, the two tasks in the student model are supervised by these teachers simultaneously. Empirical studies on three datasets covering 7 different languages confirm the effectiveness of the proposed model.
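The abstract describes a student jointly supervised by two source-domain teachers (an entity recognizer and a similarity evaluator). The page does not give the training objective, so the following is only a minimal sketch of the general two-teacher soft-label distillation idea, assuming KL divergence on temperature-softened outputs; all function names and the `alpha`/`temperature` weighting are hypothetical illustration, not the paper's actual loss.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over the last axis."""
    z = np.asarray(logits, dtype=float) / temperature
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) summed over all entries."""
    return float(np.sum(p * (np.log(p + eps) - np.log(q + eps))))

def multi_teacher_distill_loss(student_ner_logits, student_sim_logits,
                               teacher_ner_logits, teacher_sim_logits,
                               alpha=0.5, temperature=2.0):
    """Hypothetical combined distillation loss: the student matches the
    softened outputs of both teachers, weighted by alpha."""
    ner_loss = kl_divergence(softmax(teacher_ner_logits, temperature),
                             softmax(student_ner_logits, temperature))
    sim_loss = kl_divergence(softmax(teacher_sim_logits, temperature),
                             softmax(student_sim_logits, temperature))
    return alpha * ner_loss + (1 - alpha) * sim_loss
```

When the student's logits equal both teachers' logits the loss is zero, and it grows as the student's softened distributions diverge from the teachers', which is the basic behavior any such distillation objective would share.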
Anthology ID:
2022.acl-long.14
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
170–179
URL:
https://aclanthology.org/2022.acl-long.14
DOI:
10.18653/v1/2022.acl-long.14
Cite (ACL):
Zhuoran Li, Chunming Hu, Xiaohui Guo, Junfan Chen, Wenyi Qin, and Richong Zhang. 2022. An Unsupervised Multiple-Task and Multiple-Teacher Model for Cross-lingual Named Entity Recognition. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 170–179, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
An Unsupervised Multiple-Task and Multiple-Teacher Model for Cross-lingual Named Entity Recognition (Li et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.14.pdf
Software:
2022.acl-long.14.software.zip
Data:
CoNLL-2003