How a Bilingual LM Becomes Bilingual: Tracing Internal Representations with Sparse Autoencoders

Tatsuro Inaba, Go Kamoda, Kentaro Inui, Masaru Isonuma, Yusuke Miyao, Yohei Oseki, Yu Takagi, Benjamin Heinzerling


Abstract
This study explores how bilingual language models develop complex internal representations. We employ sparse autoencoders to analyze the internal representations of bilingual language models, with a focus on the effects of training steps, layers, and model sizes. Our analysis shows that language models first learn languages separately, and then gradually form bilingual alignments, particularly in the mid layers. We also find that this bilingual tendency is stronger in larger models. Building on these findings, we demonstrate the critical role of bilingual representations in model performance by employing a novel method that integrates decomposed representations from a fully trained model into a mid-training model. Our results provide insights into how language models acquire bilingual capabilities.
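The analysis hinges on sparse autoencoders trained on the model's hidden states. As a rough illustration only (not the authors' implementation), a minimal SAE of this kind can be sketched as below; the dimensions, ReLU activation, and L1 coefficient are assumptions chosen for clarity.

```python
# Minimal sketch of a sparse autoencoder over LM hidden states.
# All sizes and hyperparameters here are illustrative assumptions,
# not the settings used in the paper.
import torch
import torch.nn as nn


class SparseAutoencoder(nn.Module):
    def __init__(self, hidden_dim: int, dict_size: int):
        super().__init__()
        # Overcomplete dictionary: dict_size >> hidden_dim.
        self.encoder = nn.Linear(hidden_dim, dict_size)
        self.decoder = nn.Linear(dict_size, hidden_dim)

    def forward(self, x: torch.Tensor):
        # Sparse, non-negative feature activations.
        features = torch.relu(self.encoder(x))
        reconstruction = self.decoder(features)
        return reconstruction, features


def sae_loss(x, reconstruction, features, l1_coef: float = 1e-3):
    # Reconstruction error plus an L1 penalty encouraging sparsity.
    mse = torch.mean((x - reconstruction) ** 2)
    sparsity = torch.mean(torch.abs(features))
    return mse + l1_coef * sparsity


if __name__ == "__main__":
    # Stand-in for residual-stream activations from a mid layer of the LM.
    hidden_dim, dict_size = 768, 4096
    sae = SparseAutoencoder(hidden_dim, dict_size)
    activations = torch.randn(32, hidden_dim)
    recon, feats = sae(activations)
    loss = sae_loss(activations, recon, feats)
    loss.backward()
```

Decomposing activations this way yields interpretable features whose language-specific or language-shared behavior can then be tracked across training steps, layers, and model sizes, as the abstract describes.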
Anthology ID:
2025.findings-emnlp.725
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
13458–13470
URL:
https://aclanthology.org/2025.findings-emnlp.725/
Cite (ACL):
Tatsuro Inaba, Go Kamoda, Kentaro Inui, Masaru Isonuma, Yusuke Miyao, Yohei Oseki, Yu Takagi, and Benjamin Heinzerling. 2025. How a Bilingual LM Becomes Bilingual: Tracing Internal Representations with Sparse Autoencoders. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 13458–13470, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
How a Bilingual LM Becomes Bilingual: Tracing Internal Representations with Sparse Autoencoders (Inaba et al., Findings 2025)
PDF:
https://aclanthology.org/2025.findings-emnlp.725.pdf
Checklist:
2025.findings-emnlp.725.checklist.pdf