XMoE: Sparse Models with Fine-grained and Adaptive Expert Selection

Yuanhang Yang, Shiyi Qi, Wenchao Gu, Chaozheng Wang, Cuiyun Gao, Zenglin Xu


Abstract
Sparse models, including sparse Mixture-of-Experts (MoE) models, have emerged as an effective approach for scaling Transformer models. However, they often suffer from computational inefficiency, since a significant number of parameters are unnecessarily involved in computation through multiplications with zero or low activation values. To address this issue, we present XMoE, a novel MoE designed to enhance both the efficacy and efficiency of sparse MoE models. XMoE leverages small experts and a threshold-based router to enable tokens to selectively engage only essential parameters. Our extensive experiments on language modeling and machine translation tasks demonstrate that XMoE enhances model performance and can decrease the computation load at MoE layers by over 50% without sacrificing performance. Furthermore, we present the versatility of XMoE by applying it to dense models, enabling sparse computation during inference. We provide a comprehensive analysis and make our code available at https://anonymous.4open.science/r/XMoE.
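To make the abstract's threshold-based routing idea concrete, here is a minimal, hypothetical PyTorch sketch of one plausible rule: each token keeps the fewest highest-probability experts whose cumulative routing weight reaches a threshold, rather than a fixed top-k. The function name threshold_route, the parameters threshold and max_experts, and the cumulative-probability rule itself are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn.functional as F

def threshold_route(hidden, router_weight, threshold=0.9, max_experts=4):
    """Hypothetical sketch of threshold-based expert selection.

    Each token keeps the smallest set of highest-probability experts
    whose cumulative routing probability reaches `threshold`, capped
    at `max_experts`, instead of always using a fixed top-k.
    """
    logits = hidden @ router_weight                 # (tokens, num_experts)
    probs = F.softmax(logits, dim=-1)
    sorted_probs, sorted_idx = probs.sort(dim=-1, descending=True)
    cum = sorted_probs.cumsum(dim=-1)
    # Keep an expert if the cumulative mass *before* it is still below
    # the threshold; the top-1 expert is therefore always kept.
    keep = (cum - sorted_probs) < threshold
    keep[:, max_experts:] = False                   # cap per-token expert count
    # Experts whose weight is zeroed out can be skipped entirely,
    # which is where the computational savings come from.
    return sorted_idx, sorted_probs * keep
```

Under this sketch, easy tokens that concentrate routing mass on one expert would dispatch to a single expert, while ambiguous tokens could use up to max_experts, making the per-token compute adaptive.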
Anthology ID:
2024.findings-acl.694
Volume:
Findings of the Association for Computational Linguistics: ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand and virtual meeting
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
11664–11674
URL:
https://aclanthology.org/2024.findings-acl.694
Cite (ACL):
Yuanhang Yang, Shiyi Qi, Wenchao Gu, Chaozheng Wang, Cuiyun Gao, and Zenglin Xu. 2024. XMoE: Sparse Models with Fine-grained and Adaptive Expert Selection. In Findings of the Association for Computational Linguistics: ACL 2024, pages 11664–11674, Bangkok, Thailand and virtual meeting. Association for Computational Linguistics.
Cite (Informal):
XMoE: Sparse Models with Fine-grained and Adaptive Expert Selection (Yang et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.694.pdf