BTW: A Non-Parametric Variance Stabilization Framework for Multimodal Model Integration

Jun Hou, Le Wang, Xuan Wang


Abstract
Mixture-of-Experts (MoE) models have become increasingly powerful in multimodal learning by enabling modular specialization across modalities. However, their effectiveness remains unclear when additional modalities introduce more noise than complementary information. Existing approaches, such as Partial Information Decomposition (PID), struggle to scale beyond two modalities and lack the resolution needed for instance-level control. We propose **B**eyond **T**wo-modality **W**eighting (**BTW**), a bi-level, non-parametric weighting framework that combines instance-level Kullback-Leibler (KL) divergence with modality-level mutual information (MI) to dynamically adjust modality importance during training. Our method requires no additional parameters and applies to an arbitrary number of modalities. Specifically, BTW computes per-example KL weights by measuring the divergence between each unimodal prediction and the current multimodal prediction, and modality-wide MI weights by estimating the global alignment between unimodal and multimodal outputs. Extensive experiments on sentiment regression and clinical classification demonstrate that our method significantly improves regression performance and multiclass classification accuracy.
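To make the two weighting signals concrete, here is a minimal sketch in Python. It assumes classification-style probability outputs; the function names, the exp(-KL) weight mapping, and the histogram-based MI estimator are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of the two BTW weighting signals (illustrative; the exact
# normalization and combination follow the paper). Assumes each model emits
# per-example class probabilities of shape (n_examples, n_classes).
import numpy as np
from scipy.special import rel_entr

def kl_instance_weights(unimodal_probs, multimodal_probs, eps=1e-12):
    """Per-example KL(unimodal || multimodal), mapped to a weight in (0, 1].

    Examples where a unimodal prediction diverges from the current fused
    prediction receive smaller weights (exp(-KL) is an assumed mapping).
    """
    p = np.clip(unimodal_probs, eps, 1.0)
    q = np.clip(multimodal_probs, eps, 1.0)
    kl = rel_entr(p, q).sum(axis=1)  # shape (n_examples,)
    return np.exp(-kl)

def mi_modality_weight(unimodal_scores, multimodal_scores, bins=20):
    """Modality-level weight from a histogram estimate of mutual information
    between scalar unimodal and multimodal outputs (e.g., max-class scores)."""
    joint, _, _ = np.histogram2d(unimodal_scores, multimodal_scores, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal over unimodal bins
    py = pxy.sum(axis=0, keepdims=True)   # marginal over multimodal bins
    nz = pxy > 0                          # avoid log(0) on empty cells
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# Example: combine both signals into a per-example weight for one modality.
rng = np.random.default_rng(0)
uni = rng.dirichlet(np.ones(3), size=8)    # stand-in unimodal probabilities
multi = rng.dirichlet(np.ones(3), size=8)  # stand-in multimodal probabilities
w = mi_modality_weight(uni.max(axis=1), multi.max(axis=1)) \
    * kl_instance_weights(uni, multi)
print(w)  # one weight per training example for this modality
```

In the paper these weights modulate each modality's contribution during training; the product above is only a stand-in for that combination step.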
Anthology ID:
2025.findings-emnlp.815
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
15089–15103
URL:
https://aclanthology.org/2025.findings-emnlp.815/
Cite (ACL):
Jun Hou, Le Wang, and Xuan Wang. 2025. BTW: A Non-Parametric Variance Stabilization Framework for Multimodal Model Integration. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 15089–15103, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
BTW: A Non-Parametric Variance Stabilization Framework for Multimodal Model Integration (Hou et al., Findings 2025)
PDF:
https://aclanthology.org/2025.findings-emnlp.815.pdf
Checklist:
2025.findings-emnlp.815.checklist.pdf