A Mixture-of-Experts Model for Antonym-Synonym Discrimination

Zhipeng Xie, Nan Zeng


Abstract
Discrimination between antonyms and synonyms is an important and challenging NLP task. Antonyms and synonyms often share the same or similar contexts and are thus hard to distinguish. This paper proposes two underlying hypotheses and employs the mixture-of-experts framework as a solution. It works on the basis of a divide-and-conquer strategy: a number of localized experts focus on their own domains (or subspaces) to learn their specialties, while a gating mechanism determines the space partitioning and the expert mixture. Experimental results show that our method achieves state-of-the-art performance on the task.
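
To make the divide-and-conquer idea concrete, below is a minimal sketch of a mixture-of-experts classifier for word pairs. It is not the authors' released model (that is at the code link below); the embedding dimension, number of experts, pair-encoding by concatenation, and all layer sizes are illustrative assumptions. Each expert is a small scorer specializing in part of the input space, and a gating network produces the soft space partitioning and the mixture weights described in the abstract.

import torch
import torch.nn as nn

class MoEPairClassifier(nn.Module):
    def __init__(self, embed_dim=300, num_experts=8, hidden_dim=128):
        super().__init__()
        pair_dim = embed_dim * 2  # concatenated embeddings of the two words (an assumed encoding)
        # Each localized expert is a small MLP emitting a score in (0, 1):
        # the probability that the pair is antonymous rather than synonymous.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(pair_dim, hidden_dim),
                nn.ReLU(),
                nn.Linear(hidden_dim, 1),
                nn.Sigmoid(),
            )
            for _ in range(num_experts)
        )
        # The gating network softly partitions the input space by assigning
        # each word pair a probability distribution over the experts.
        self.gate = nn.Sequential(
            nn.Linear(pair_dim, num_experts),
            nn.Softmax(dim=-1),
        )

    def forward(self, emb_a, emb_b):
        pair = torch.cat([emb_a, emb_b], dim=-1)           # (batch, 2*embed_dim)
        gate_weights = self.gate(pair)                     # (batch, num_experts)
        expert_scores = torch.cat(
            [expert(pair) for expert in self.experts], dim=-1
        )                                                  # (batch, num_experts)
        # The final prediction is the gate-weighted mixture of expert scores.
        return (gate_weights * expert_scores).sum(dim=-1)  # (batch,)

# Usage, with random vectors standing in for pretrained word embeddings:
model = MoEPairClassifier()
emb_a, emb_b = torch.randn(4, 300), torch.randn(4, 300)
prob_antonym = model(emb_a, emb_b)  # one antonymy probability per pair

Because the gate and the experts are trained jointly, each expert can specialize in one region (subspace) of the embedding space while the gate learns which expert to trust for a given pair.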
Anthology ID:
2021.acl-short.71
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
558–564
URL:
https://aclanthology.org/2021.acl-short.71
DOI:
10.18653/v1/2021.acl-short.71
Cite (ACL):
Zhipeng Xie and Nan Zeng. 2021. A Mixture-of-Experts Model for Antonym-Synonym Discrimination. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 558–564, Online. Association for Computational Linguistics.
Cite (Informal):
A Mixture-of-Experts Model for Antonym-Synonym Discrimination (Xie & Zeng, ACL-IJCNLP 2021)
PDF:
https://aclanthology.org/2021.acl-short.71.pdf
Optional supplementary material:
 2021.acl-short.71.OptionalSupplementaryMaterial.zip
Video:
 https://aclanthology.org/2021.acl-short.71.mp4
Code:
 zengnan1997/moe-asd