Text-centric Alignment for Bridging Test-time Unseen Modality

Yun-Da Tsai, Ting-Yu Yen, Pei-Fu Guo, Zhe-Yan Li, Shou-De Lin


Abstract
This paper addresses the challenge of handling unseen modalities and dynamic modality combinations at test time with our proposed text-centric alignment method. This training-free approach unifies different input modalities into a single semantic text representation by leveraging in-context learning with Large Language Models and uni-modal foundation models. Our method significantly improves the handling of unseen, diverse, and unpredictable modality combinations and can be adopted on top of both generative and discriminative models. Our extensive experiments, which primarily evaluate discriminative tasks, demonstrate that our approach is essential for LLMs to achieve strong modality alignment performance and that it overcomes the limitations of traditional fixed-modality embedding frameworks. This study contributes a flexible and effective solution for real-world applications where modality availability is dynamic and uncertain.
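
As a rough illustration of the text-centric pipeline the abstract describes, the sketch below converts each incoming modality to text with placeholder uni-modal converters and composes a single prompt for an LLM. All function names, converter outputs, and the query_llm call are hypothetical stand-ins, not the paper's actual models or prompting details.

from typing import Any, Callable, Dict

# Hypothetical "modality -> text" converters. In practice these would wrap
# uni-modal foundation models (e.g., an image captioner or an audio
# transcriber); here they are stubs so the sketch runs end to end.
def image_to_text(image: Any) -> str:
    return "A photo of a golden retriever playing in a park."

def audio_to_text(audio: Any) -> str:
    return "Upbeat background music with a dog barking twice."

CONVERTERS: Dict[str, Callable[[Any], str]] = {
    "image": image_to_text,
    "audio": audio_to_text,
    "text": lambda x: str(x),  # text inputs pass through unchanged
}

def align_to_text(inputs: Dict[str, Any]) -> str:
    """Unify an arbitrary (possibly unseen) modality combination into a
    single semantic text representation by converting each input to text."""
    parts = []
    for modality, value in inputs.items():
        converter = CONVERTERS.get(modality)
        if converter is None:
            continue  # skip modalities with no available converter
        parts.append(f"[{modality}] {converter(value)}")
    return "\n".join(parts)

def query_llm(prompt: str) -> str:
    # Stand-in for an in-context-learning call to an LLM; a real
    # implementation would call an actual model API here.
    return f"(LLM answer for prompt of {len(prompt)} characters)"

if __name__ == "__main__":
    # At test time the modality combination may differ from training;
    # the pipeline only needs a text converter for each incoming modality.
    sample = {"image": object(), "audio": object(), "text": "Is a dog present?"}
    unified = align_to_text(sample)
    prompt = f"Context:\n{unified}\n\nTask: answer the question using the context."
    print(query_llm(prompt))

Because the fused representation is plain text, a new modality can be supported at test time by supplying only a text converter for it, with no retraining of the downstream model.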
Anthology ID:
2025.findings-emnlp.206
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3826–3845
URL:
https://aclanthology.org/2025.findings-emnlp.206/
Cite (ACL):
Yun-Da Tsai, Ting-Yu Yen, Pei-Fu Guo, Zhe-Yan Li, and Shou-De Lin. 2025. Text-centric Alignment for Bridging Test-time Unseen Modality. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 3826–3845, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Text-centric Alignment for Bridging Test-time Unseen Modality (Tsai et al., Findings 2025)
PDF:
https://aclanthology.org/2025.findings-emnlp.206.pdf
Checklist:
2025.findings-emnlp.206.checklist.pdf