Probing Semantic Routing in Large Mixture-of-Expert Models
Matthew Lyle Olson | Neale Ratzlaff | Musashi Hinck | Man Luo | Sungduk Yu | Chendi Xue | Vasudev Lal
Findings of the Association for Computational Linguistics: EMNLP 2025
In the past year, large (>100B parameter) mixture-of-expert (MoE) models have become increasingly common in the open domain. While their advantages are often framed in terms of efficiency, prior work has also explored functional differentiation through routing behavior. We investigate whether expert routing in large MoE models is influenced by the semantics of the inputs. To test this, we design two controlled experiments. First, we compare activations on sentence pairs with a shared target word used in the same or different senses. Second, we fix context and substitute the target word with semantically similar or dissimilar alternatives. Comparing expert overlap across these conditions reveals clear, statistically significant evidence of semantic routing in large MoE models.
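To make the comparison concrete, here is a minimal sketch of the kind of overlap statistic the abstract implies, assuming routing is summarized as the set of top-k experts activated for the target token at a given layer and overlap is measured with Jaccard similarity. The expert IDs, the sense example, and the choice of Jaccard are illustrative assumptions, not the authors' exact metric.

```python
def expert_overlap(experts_a: set[int], experts_b: set[int]) -> float:
    """Jaccard overlap between the expert sets routed to a target token
    in two contexts. Hypothetical statistic; the paper's exact measure
    of expert overlap is not specified on this page."""
    if not experts_a and not experts_b:
        return 1.0
    return len(experts_a & experts_b) / len(experts_a | experts_b)

# Hypothetical top-k expert IDs for the shared target word "bank"
# in two sentences using the SAME sense (river bank):
same_sense = expert_overlap({3, 17, 42, 56}, {3, 17, 42, 91})

# ...and in two sentences using DIFFERENT senses
# (river bank vs. financial bank):
diff_sense = expert_overlap({3, 17, 42, 56}, {8, 17, 63, 77})

# Semantic routing predicts higher overlap in the same-sense condition.
print(f"same-sense overlap:      {same_sense:.2f}")
print(f"different-sense overlap: {diff_sense:.2f}")
```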