MMA: Cross-Domain Knowledge Integration via Mixture of Multi-Domain Agents

Kehang Jia, Juntao Li, Xiaobo Liang, Yisheng Xiao, Yixuan Yang, Min Zhang


Abstract
Beyond merely retaining previously acquired generalization, achieving synergistic improvements between generalization and domain specialization in foundation models remains a significant challenge in both pre-training and post-training. As an alternative, we propose a test-time cross-domain knowledge integration method, Mixture of Multi-domain Agents (MMA), which dynamically combines the outputs of general-purpose and domain-specific models to enhance performance on complex, domain-specific tasks. MMA formulates the integration process as a search problem, using Monte Carlo Tree Search (MCTS) to find the path that best harmonizes the respective strengths of different models in generalization and domain-specific knowledge. In addition, we design dedicated action spaces to control knowledge integration across multiple models and introduce a cross-inspection reward to score strategies fairly across domains. Experiments in diverse domains show that MMA effectively combines the strengths of different models to improve their performance: on legal benchmarks, average performance across all tasks increases from 42.57% to 53.68%, and on financial benchmarks from 56.01% to 62.68%.
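The paper itself specifies the search procedure; as a rough illustration only, the sketch below shows the general shape of an MCTS loop over integration actions, where each action lets one model refine the current partial answer. The stand-in models, the two-action space, and the random cross_inspection_reward are placeholder assumptions for this sketch, not the authors' implementation.

import math
import random

# Hypothetical stand-ins for a general-purpose and a domain-specific model;
# in practice these would be LLM calls that refine a partial answer.
def general_model(state):
    return state + ["general-refinement"]

def domain_model(state):
    return state + ["domain-refinement"]

ACTIONS = {"general": general_model, "domain": domain_model}

def cross_inspection_reward(state):
    # Placeholder for the paper's cross-inspection reward, in which models
    # score each other's strategies; a random score stands in here.
    return random.random()

class Node:
    def __init__(self, state, parent=None):
        self.state = state
        self.parent = parent
        self.children = {}   # action name -> child Node
        self.visits = 0
        self.value = 0.0

    def ucb(self, c=1.4):
        # Upper Confidence Bound used during selection.
        if self.visits == 0:
            return float("inf")
        return (self.value / self.visits
                + c * math.sqrt(math.log(self.parent.visits) / self.visits))

def mcts(root_state, iterations=200, max_depth=3):
    root = Node(root_state)
    for _ in range(iterations):
        node = root
        # Selection: descend through fully expanded nodes via UCB.
        while node.children and len(node.children) == len(ACTIONS):
            node = max(node.children.values(), key=lambda n: n.ucb())
        # Expansion: apply one untried action (one model's refinement step).
        untried = [a for a in ACTIONS if a not in node.children]
        if untried and len(node.state) < max_depth:
            action = random.choice(untried)
            node.children[action] = Node(ACTIONS[action](node.state), parent=node)
            node = node.children[action]
        # Evaluation: score the current integrated answer.
        reward = cross_inspection_reward(node.state)
        # Backpropagation: update statistics along the path to the root.
        while node is not None:
            node.visits += 1
            node.value += reward
            node = node.parent
    # Follow the most-visited child as the chosen integration step.
    best = max(root.children.values(), key=lambda n: n.visits)
    return best.state

if __name__ == "__main__":
    print(mcts([]))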
Anthology ID:
2025.findings-emnlp.707
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
13145–13160
URL:
https://aclanthology.org/2025.findings-emnlp.707/
Cite (ACL):
Kehang Jia, Juntao Li, Xiaobo Liang, Yisheng Xiao, Yixuan Yang, and Min Zhang. 2025. MMA: Cross-Domain Knowledge Integration via Mixture of Multi-Domain Agents. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 13145–13160, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
MMA: Cross-Domain Knowledge Integration via Mixture of Multi-Domain Agents (Jia et al., Findings 2025)
PDF:
https://aclanthology.org/2025.findings-emnlp.707.pdf
Checklist:
2025.findings-emnlp.707.checklist.pdf