HW-TSC at SemEval-2022 Task 3: A Unified Approach Fine-tuned on Multilingual Pretrained Model for PreTENS

Yinglu Li, Min Zhang, Xiaosong Qiao, Minghan Wang


Abstract
In this paper, we describe a unified system for Task 3 of SemEval-2022 (PreTENS). The task aims to recognize the semantic structure of sentences containing two nominal arguments and to evaluate the degree of their taxonomic relation. We adopt the strategy of adding a language prefix tag to each training sample, which proves effective for the model. We also split the training set so that the model cannot learn translation information from parallel sentences. For the task, we propose a unified model fine-tuned on the multilingual pretrained model XLM-RoBERTa. The model performs well on subtask 1 (the binary classification subtask). To verify whether it can also perform well on subtask 2 (the regression subtask), we transform the ranking scores into classification labels with an up-sampling strategy. An ensemble strategy further improves performance. As a result, our model placed second in both subtask 1 and subtask 2 of the competition evaluation.
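The abstract mentions two data-preparation ideas: prepending a language prefix tag so one unified model handles all task languages, and binning subtask-2 ranking scores into classification labels. The following is a minimal sketch of that setup using Hugging Face transformers; the checkpoint name, prefix format, label count, and rounding-based binning are illustrative assumptions, not the authors' exact configuration (the paper describes an up-sampling strategy whose details are not given here).

```python
# Sketch of the approach described in the abstract (assumptions noted inline):
# (1) prepend a language prefix tag to each sentence;
# (2) map continuous subtask-2 scores to discrete class labels.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

NUM_LABELS = 7  # assumption: one class per point of the 1-7 rating scale

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "xlm-roberta-base", num_labels=NUM_LABELS
)

def add_language_prefix(sentence: str, lang: str) -> str:
    """Prepend a language tag (e.g. '<en>') so a single unified model
    can tell the task languages apart. The exact tag format is assumed."""
    return f"<{lang}> {sentence}"

def score_to_label(score: float) -> int:
    """Map a continuous ranking score to a class index by rounding.
    The paper's actual binning / up-sampling scheme may differ."""
    return int(round(score)) - 1  # scores assumed to lie on a 1-7 scale

# Forward pass on one prefixed example; fine-tune with cross-entropy as usual.
batch = tokenizer(
    [add_language_prefix("I like dogs more than animals.", "en")],
    return_tensors="pt", padding=True, truncation=True,
)
logits = model(**batch).logits
```

Casting the regression subtask as classification this way lets the same fine-tuned XLM-RoBERTa head serve both subtasks, which is the "unified" aspect the title refers to.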
Anthology ID:
2022.semeval-1.37
Volume:
Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022)
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Guy Emerson, Natalie Schluter, Gabriel Stanovsky, Ritesh Kumar, Alexis Palmer, Nathan Schneider, Siddharth Singh, Shyam Ratan
Venue:
SemEval
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
291–297
URL:
https://aclanthology.org/2022.semeval-1.37
DOI:
10.18653/v1/2022.semeval-1.37
Cite (ACL):
Yinglu Li, Min Zhang, Xiaosong Qiao, and Minghan Wang. 2022. HW-TSC at SemEval-2022 Task 3: A Unified Approach Fine-tuned on Multilingual Pretrained Model for PreTENS. In Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022), pages 291–297, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
HW-TSC at SemEval-2022 Task 3: A Unified Approach Fine-tuned on Multilingual Pretrained Model for PreTENS (Li et al., SemEval 2022)
PDF:
https://aclanthology.org/2022.semeval-1.37.pdf
Video:
https://aclanthology.org/2022.semeval-1.37.mp4