Searching for Structure: Investigating Emergent Communication with Large Language Models

Tom Kouwenhoven, Max Peeperkorn, Tessa Verhoef


Abstract
Human languages have evolved to be structured through repeated language learning and use. These processes introduce biases that operate during language acquisition and shape linguistic systems toward communicative efficiency. In this paper, we investigate whether the same happens when artificial languages are optimised for the implicit biases of Large Language Models (LLMs). To this end, we simulate a classical referential game in which LLMs learn and use artificial languages. Our results show that initially unstructured holistic languages are indeed shaped to acquire structural properties that allow two LLM agents to communicate successfully. Similar to observations in human experiments, generational transmission increases the learnability of languages, but can at the same time result in degenerate, non-humanlike vocabularies. Taken together, this work extends experimental findings, shows that LLMs can be used as tools in simulations of language evolution, and opens possibilities for future human-machine experiments in this field.
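
To make the referential-game setup concrete, the sketch below plays repeated rounds in which a speaker names a target meaning and a listener must pick it out from distractors. It is a minimal illustration only: the paper's agents are LLMs prompted to learn and use the artificial language, whereas here the speaker and listener are rule-based stand-ins, and the toy holistic lexicon, meaning space, and all function names are assumptions made to keep the example self-contained and runnable.

import random

# Hypothetical stand-ins for the LLM speaker and listener; the paper prompts
# actual LLMs, so these rule-based versions are assumptions for illustration.
def speaker(lexicon, target):
    """Produce a signal for the target meaning (here: a lexicon lookup)."""
    return lexicon[target]

def listener(lexicon, signal, candidates):
    """Guess which candidate meaning the received signal refers to."""
    matches = [m for m in candidates if lexicon.get(m) == signal]
    return matches[0] if matches else random.choice(candidates)

def referential_game(lexicon, meanings, n_rounds=100, n_distractors=3):
    """Play repeated rounds and return the communicative success rate."""
    successes = 0
    for _ in range(n_rounds):
        target = random.choice(meanings)
        distractors = random.sample([m for m in meanings if m != target], n_distractors)
        candidates = distractors + [target]
        random.shuffle(candidates)
        guess = listener(lexicon, speaker(lexicon, target), candidates)
        successes += guess == target
    return successes / n_rounds

if __name__ == "__main__":
    # A toy holistic (unstructured) language: one arbitrary word per meaning.
    meanings = [(shape, colour) for shape in ("circle", "square", "triangle")
                for colour in ("red", "blue", "green")]
    lexicon = {m: f"word{i}" for i, m in enumerate(meanings)}
    print(f"communicative success: {referential_game(lexicon, meanings):.2f}")

With this one-to-one lexicon the listener always recovers the target; communicative pressure becomes interesting once the mapping must be learned from limited exposure, as with the LLM agents in the paper.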
Anthology ID: 2025.coling-main.667
Volume: Proceedings of the 31st International Conference on Computational Linguistics
Month: January
Year: 2025
Address: Abu Dhabi, UAE
Editors: Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue: COLING
Publisher: Association for Computational Linguistics
Pages: 9977–9991
URL: https://aclanthology.org/2025.coling-main.667/
Cite (ACL): Tom Kouwenhoven, Max Peeperkorn, and Tessa Verhoef. 2025. Searching for Structure: Investigating Emergent Communication with Large Language Models. In Proceedings of the 31st International Conference on Computational Linguistics, pages 9977–9991, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal): Searching for Structure: Investigating Emergent Communication with Large Language Models (Kouwenhoven et al., COLING 2025)
PDF: https://aclanthology.org/2025.coling-main.667.pdf