NeuronMoE: Efficient Cross-Lingual Extension via Neuron-Guided Mixture-of-Experts

Rongzhi Li, Hitomi Yanaka


Abstract
Extending large language models to low-resource languages is essential for global accessibility, but training separate models per language is prohibitively expensive. Mixture-of-Experts (MoE) architectures address this by adding sparse language-specific parameters, but determining how many experts each layer needs remains an open question. Current approaches allocate experts based on layer-level similarity, yet language processing exhibits fine-grained specialization at the level of individual neurons. We propose NeuronMoE, a method that analyzes language-specific neurons across all transformer components and uses the empirically measured cross-lingual neuron diversity to guide per-layer expert allocation. Applied to Llama-3.2-3B for three low-resource languages (Greek, Turkish, and Hungarian), this approach achieves roughly a 40% average parameter reduction while matching the performance of the LayerMoE baseline. We find that low-resource language experts independently develop neuron specialization patterns mirroring those of the high-resource language, concentrated in early and late layers, which points to potentially universal architectural principles in how multilingual models organize linguistic knowledge. Our approach generalizes across architectures, as validated on Qwen, and shows that allocation strategy matters more than total expert count.
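The abstract describes expert allocation driven by cross-lingual neuron diversity but does not spell out the procedure. The sketch below illustrates one plausible reading of that idea, not the paper's exact method: language-specific neurons are identified by thresholding activation probabilities, per-layer diversity is measured as one minus the mean pairwise Jaccard overlap of those neuron sets, and a fixed expert budget is split across layers in proportion to that diversity. All function names, thresholds, and the diversity metric here are assumptions made for illustration.

```python
import numpy as np

def language_specific_neurons(act_prob, threshold=0.95):
    """Boolean mask of neurons whose activation probability on a language's
    corpus exceeds `threshold` (a common neuron-probing recipe; assumed here)."""
    return act_prob > threshold

def cross_lingual_diversity(neuron_masks):
    """1 - mean pairwise Jaccard similarity of language-specific neuron sets
    in one layer; higher means the languages rely on more distinct neurons."""
    langs = list(neuron_masks)
    sims = []
    for i in range(len(langs)):
        for j in range(i + 1, len(langs)):
            a, b = neuron_masks[langs[i]], neuron_masks[langs[j]]
            union = np.logical_or(a, b).sum()
            inter = np.logical_and(a, b).sum()
            sims.append(inter / union if union else 1.0)
    return 1.0 - float(np.mean(sims)) if sims else 0.0

def allocate_experts(per_layer_diversity, total_experts):
    """Distribute a fixed expert budget across layers in proportion to the
    measured diversity; layers where languages diverge more get more experts."""
    div = np.asarray(per_layer_diversity, dtype=float)
    weights = div / div.sum() if div.sum() > 0 else np.full(len(div), 1 / len(div))
    alloc = np.floor(weights * total_experts).astype(int)
    # Hand any leftover experts to the highest-diversity layers.
    for idx in np.argsort(-weights)[: total_experts - alloc.sum()]:
        alloc[idx] += 1
    return alloc

# Toy example: 4 layers, 3 languages, 8 neurons per layer, budget of 6 experts.
rng = np.random.default_rng(0)
masks = [
    {lang: language_specific_neurons(rng.random(8), threshold=0.5)
     for lang in ("el", "tr", "hu")}
    for _ in range(4)
]
diversity = [cross_lingual_diversity(m) for m in masks]
print(allocate_experts(diversity, total_experts=6))
```

The key design point mirrored here is that the total number of experts is held fixed while their placement varies per layer, which is what lets an allocation strategy change parameter cost without changing the expert budget.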
Anthology ID:
2026.eacl-long.117
Volume:
Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Vera Demberg, Kentaro Inui, Lluís Màrquez
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
2573–2586
URL:
https://aclanthology.org/2026.eacl-long.117/
Cite (ACL):
Rongzhi Li and Hitomi Yanaka. 2026. NeuronMoE: Efficient Cross-Lingual Extension via Neuron-Guided Mixture-of-Experts. In Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2573–2586, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
NeuronMoE: Efficient Cross-Lingual Extension via Neuron-Guided Mixture-of-Experts (Li & Yanaka, EACL 2026)
PDF:
https://aclanthology.org/2026.eacl-long.117.pdf
Checklist:
2026.eacl-long.117.checklist.pdf