Silp_nlp at SemEval-2023 Task 2: Cross-lingual Knowledge Transfer for Mono-lingual Learning

Sumit Singh, Uma Tiwary


Abstract
Our team silp_nlp participated in SemEval-2023 Task 2: MultiCoNER II. We built systems for 11 mono-lingual tracks. To leverage knowledge from all tracks, we chose transformer-based pretrained models, which have strong cross-lingual transferability. Our model was therefore trained in two stages: the first stage performs multi-lingual learning on data from all tracks, and the second fine-tunes on an individual track. Our work highlights that knowledge from all tracks can be transferred to an individual track if the baseline language model has cross-lingual features. Our system placed in the top 10 for 4 tracks, scoring a 0.7432 macro F1 score on the Hindi track (7th rank) and a 0.7322 macro F1 score on the Bangla track (9th rank).
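The sketch below illustrates the two-stage recipe described in the abstract: a multilingual transformer is first fine-tuned for token classification on NER data pooled from all tracks, and the resulting weights are then fine-tuned further on a single track. The checkpoint name (xlm-roberta-base), label set, hyperparameters, and toy data are illustrative assumptions, not the authors' exact configuration.

# Minimal two-stage fine-tuning sketch (assumptions: checkpoint, labels, toy data).
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

LABELS = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC"]   # assumed label set
MODEL_NAME = "xlm-roberta-base"                       # assumed multilingual checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForTokenClassification.from_pretrained(
    MODEL_NAME, num_labels=len(LABELS)
)

def encode(tokens, tags):
    """Tokenize a pre-split sentence and align word-level tags to subwords."""
    enc = tokenizer(tokens, is_split_into_words=True, truncation=True,
                    padding="max_length", max_length=32, return_tensors="pt")
    word_ids = enc.word_ids(0)
    # Ignore special/padding positions (-100) so they do not contribute to the loss.
    labels = [-100 if w is None else LABELS.index(tags[w]) for w in word_ids]
    enc["labels"] = torch.tensor([labels])
    return enc

def train(model, examples, epochs=1, lr=2e-5):
    """One shared training loop used for both stages."""
    optim = torch.optim.AdamW(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for tokens, tags in examples:
            batch = encode(tokens, tags)
            loss = model(**batch).loss
            loss.backward()
            optim.step()
            optim.zero_grad()
    return model

# Toy stand-ins for the pooled multilingual data and one mono-lingual track.
multilingual_data = [
    (["Delhi", "is", "crowded"], ["B-LOC", "O", "O"]),
    (["Dhaka", "is", "beautiful"], ["B-LOC", "O", "O"]),
]
hindi_track_data = [
    (["दिल्ली", "सुंदर", "है"], ["B-LOC", "O", "O"]),
]

# Stage 1: multi-lingual learning on data from all tracks combined.
model = train(model, multilingual_data)
# Stage 2: continue fine-tuning the same weights on an individual track.
model = train(model, hindi_track_data)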
Anthology ID:
2023.semeval-1.164
Volume:
Proceedings of the 17th International Workshop on Semantic Evaluation (SemEval-2023)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Atul Kr. Ojha, A. Seza Doğruöz, Giovanni Da San Martino, Harish Tayyar Madabushi, Ritesh Kumar, Elisa Sartori
Venue:
SemEval
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
1183–1189
URL:
https://aclanthology.org/2023.semeval-1.164
DOI:
10.18653/v1/2023.semeval-1.164
Cite (ACL):
Sumit Singh and Uma Tiwary. 2023. Silp_nlp at SemEval-2023 Task 2: Cross-lingual Knowledge Transfer for Mono-lingual Learning. In Proceedings of the 17th International Workshop on Semantic Evaluation (SemEval-2023), pages 1183–1189, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Silp_nlp at SemEval-2023 Task 2: Cross-lingual Knowledge Transfer for Mono-lingual Learning (Singh & Tiwary, SemEval 2023)
PDF:
https://aclanthology.org/2023.semeval-1.164.pdf