Zhestyatsky at SemEval-2021 Task 2: ReLU over Cosine Similarity for BERT Fine-tuning

Boris Zhestiankin, Maria Ponomareva


Abstract
This paper presents our contribution to SemEval-2021 Task 2: Multilingual and Cross-lingual Word-in-Context Disambiguation (MCL-WiC). Our experiments cover the English (EN-EN) sub-track from the multilingual setting of the task. We experiment with several pre-trained language models and investigate the impact of different top layers on fine-tuning. We find that the combination of Cosine Similarity and ReLU activation leads to the most effective fine-tuning procedure. Our best model achieves an accuracy of 92.7%, the fourth-best score in the EN-EN sub-track.
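The top layer the abstract describes admits a compact sketch. The PyTorch snippet below is a minimal, hypothetical illustration of a ReLU-over-cosine-similarity head for WiC-style fine-tuning: it assumes target-word embeddings are read from the encoder's last hidden layer and that a single linear layer maps the rectified similarity to a logit; the class and argument names (CosineReluHead, idx1, idx2) are illustrative and not taken from the authors' released code (see zhestyatsky/MCL-WiC for that).

# Hypothetical sketch, not the authors' implementation: a ReLU over the
# cosine similarity of two contextual target-word embeddings.
import torch
import torch.nn as nn
from transformers import AutoModel

class CosineReluHead(nn.Module):
    """Scores whether a target word keeps its sense across two contexts."""

    def __init__(self, model_name: str = "bert-base-cased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        self.cos = nn.CosineSimilarity(dim=-1)
        self.relu = nn.ReLU()
        # Assumption: one linear layer turns the rectified similarity
        # into a binary-classification logit.
        self.classifier = nn.Linear(1, 1)

    def forward(self, batch1, batch2, idx1, idx2):
        # Contextual embeddings for each sentence: (batch, tokens, hidden).
        h1 = self.encoder(**batch1).last_hidden_state
        h2 = self.encoder(**batch2).last_hidden_state
        # Pick out the target word's embedding in each sentence: (batch, hidden).
        v1 = h1[torch.arange(h1.size(0)), idx1]
        v2 = h2[torch.arange(h2.size(0)), idx2]
        # ReLU over cosine similarity: negative similarities are clipped to 0.
        sim = self.relu(self.cos(v1, v2))
        return self.classifier(sim.unsqueeze(-1)).squeeze(-1)  # logits, (batch,)

One motivation for this shape of head is that cosine similarity already encodes "same sense vs. different sense" geometrically, so rectifying it and attaching a tiny classifier keeps nearly all trainable capacity in the fine-tuned encoder itself.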
Anthology ID:
2021.semeval-1.17
Volume:
Proceedings of the 15th International Workshop on Semantic Evaluation (SemEval-2021)
Month:
August
Year:
2021
Address:
Online
Editors:
Alexis Palmer, Nathan Schneider, Natalie Schluter, Guy Emerson, Aurelie Herbelot, Xiaodan Zhu
Venue:
SemEval
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
163–168
URL:
https://aclanthology.org/2021.semeval-1.17
DOI:
10.18653/v1/2021.semeval-1.17
Cite (ACL):
Boris Zhestiankin and Maria Ponomareva. 2021. Zhestyatsky at SemEval-2021 Task 2: ReLU over Cosine Similarity for BERT Fine-tuning. In Proceedings of the 15th International Workshop on Semantic Evaluation (SemEval-2021), pages 163–168, Online. Association for Computational Linguistics.
Cite (Informal):
Zhestyatsky at SemEval-2021 Task 2: ReLU over Cosine Similarity for BERT Fine-tuning (Zhestiankin & Ponomareva, SemEval 2021)
PDF:
https://aclanthology.org/2021.semeval-1.17.pdf
Code:
zhestyatsky/MCL-WiC
Data:
SuperGLUE, WiC