Using Deep Mixture-of-Experts to Detect Word Meaning Shift for TempoWiC

Ze Chen, Kangxu Wang, Zijian Cai, Jiewen Zheng, Jiarong He, Max Gao, Jason Zhang


Abstract
This paper describes the dma submission to the TempoWiC task, which achieves a macro-F1 score of 77.05% and ranks first in the task. We first explore the impact of different pre-trained language models. We then apply data cleaning, data augmentation, and adversarial training to improve the model's generalization and robustness. For further improvement, we integrate POS information and word semantic representations using a Mixture-of-Experts (MoE) approach. The experimental results show that MoE overcomes the feature-overuse issue and combines the context, POS, and word semantic features well. Finally, we use a model ensemble for the final prediction, a strategy shown to be effective in many prior works.
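The MoE idea in the abstract can be illustrated with a minimal numpy sketch: three feature "experts" (context, POS, and target-word semantics) are mixed by a softmax gate into one representation. All names, dimensions, and the random weights standing in for learned parameters are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each "expert" produces a d-dim feature vector for
# one example -- a contextual embedding, a POS embedding, and a
# target-word semantic embedding (names are illustrative only).
d = 8
context_feat = rng.normal(size=d)
pos_feat = rng.normal(size=d)
word_feat = rng.normal(size=d)

experts = np.stack([context_feat, pos_feat, word_feat])  # shape (3, d)

# A gating network scores each expert from the concatenated features and
# softmax-normalizes the scores; gate_w is a random stand-in for learned
# gating parameters.
gate_w = rng.normal(size=(3, 3 * d))
logits = gate_w @ np.concatenate([context_feat, pos_feat, word_feat])
gates = np.exp(logits - logits.max())
gates /= gates.sum()  # gates form a convex combination over experts

# The mixed representation is the gate-weighted sum of expert outputs;
# a downstream classifier would predict the binary meaning-shift label.
mixed = gates @ experts  # shape (d,)
```

Because the gate is input-dependent, the model can lean on POS or word-semantic features only for the examples where they help, which is one way an MoE can avoid overusing any single feature.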
Anthology ID:
2022.evonlp-1.2
Volume:
Proceedings of the First Workshop on Ever Evolving NLP (EvoNLP)
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates (Hybrid)
Editors:
Francesco Barbieri, Jose Camacho-Collados, Bhuwan Dhingra, Luis Espinosa-Anke, Elena Gribovskaya, Angeliki Lazaridou, Daniel Loureiro, Leonardo Neves
Venue:
EvoNLP
Publisher:
Association for Computational Linguistics
Pages:
7–11
URL:
https://aclanthology.org/2022.evonlp-1.2
DOI:
10.18653/v1/2022.evonlp-1.2
Cite (ACL):
Ze Chen, Kangxu Wang, Zijian Cai, Jiewen Zheng, Jiarong He, Max Gao, and Jason Zhang. 2022. Using Deep Mixture-of-Experts to Detect Word Meaning Shift for TempoWiC. In Proceedings of the First Workshop on Ever Evolving NLP (EvoNLP), pages 7–11, Abu Dhabi, United Arab Emirates (Hybrid). Association for Computational Linguistics.
Cite (Informal):
Using Deep Mixture-of-Experts to Detect Word Meaning Shift for TempoWiC (Chen et al., EvoNLP 2022)
PDF:
https://aclanthology.org/2022.evonlp-1.2.pdf
Video:
https://aclanthology.org/2022.evonlp-1.2.mp4