NeLLCom-X: A Comprehensive Neural-Agent Framework to Simulate Language Learning and Group Communication

Yuchen Lian, Tessa Verhoef, Arianna Bisazza


Abstract
Recent advances in computational linguistics include simulating the emergence of human-like languages with interacting neural network agents, starting from sets of random symbols. The recently introduced NeLLCom framework (Lian et al., 2023) allows agents to first learn an artificial language and then use it to communicate, with the aim of studying the emergence of specific linguistic properties. We extend this framework (NeLLCom-X) by introducing more realistic role-alternating agents and group communication in order to investigate the interplay between language learnability, communication pressures, and group size effects. We validate NeLLCom-X by replicating key findings from prior research simulating the emergence of a word-order/case-marking trade-off. Next, we investigate how interaction affects linguistic convergence and emergence of the trade-off. The novel framework facilitates future simulations of diverse linguistic aspects, emphasizing the importance of interaction and group dynamics in language evolution.
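To make the role-alternating group setup concrete, the sketch below shows a minimal, hypothetical round-robin interaction loop in Python in which every agent takes turns as speaker and listener with every other group member. All class and function names (Agent, speak, listen, group_round) and the dictionary-based "language" are invented for illustration; the actual NeLLCom-X agents are trained neural speaker/listener networks.

import random

class Agent:
    """Toy stand-in for a NeLLCom-X-style agent (illustrative only).
    A 'language' here is just a meaning->utterance dictionary; the real
    framework uses trained neural speaking and listening modules."""
    def __init__(self, name, lexicon):
        self.name = name
        self.lexicon = dict(lexicon)                        # speaking: meaning -> utterance
        self.inverse = {u: m for m, u in lexicon.items()}   # listening: utterance -> meaning

    def speak(self, meaning):
        return self.lexicon[meaning]

    def listen(self, utterance):
        return self.inverse.get(utterance)                  # None signals a failed interpretation


def group_round(agents, meanings):
    """One communication round: every ordered pair of distinct agents
    interacts once, so each agent acts as both speaker and listener."""
    successes, trials = 0, 0
    for speaker in agents:
        for listener in agents:
            if speaker is listener:
                continue
            meaning = random.choice(meanings)
            guess = listener.listen(speaker.speak(meaning))
            successes += (guess == meaning)
            trials += 1
    return successes / trials


if __name__ == "__main__":
    meanings = ["A", "B"]
    shared_lexicon = {"A": "ba", "B": "du"}
    group = [Agent(f"agent{i}", shared_lexicon) for i in range(4)]
    print("communicative success:", group_round(group, meanings))

In the paper's setting, the success signal from such interactions would drive further learning, and varying the number of agents in the group is what allows group-size effects to be studied.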
Anthology ID:
2024.conll-1.19
Volume:
Proceedings of the 28th Conference on Computational Natural Language Learning
Month:
November
Year:
2024
Address:
Miami, FL, USA
Editors:
Libby Barak, Malihe Alikhani
Venue:
CoNLL
Publisher:
Association for Computational Linguistics
Pages:
243–258
URL:
https://aclanthology.org/2024.conll-1.19
Cite (ACL):
Yuchen Lian, Tessa Verhoef, and Arianna Bisazza. 2024. NeLLCom-X: A Comprehensive Neural-Agent Framework to Simulate Language Learning and Group Communication. In Proceedings of the 28th Conference on Computational Natural Language Learning, pages 243–258, Miami, FL, USA. Association for Computational Linguistics.
Cite (Informal):
NeLLCom-X: A Comprehensive Neural-Agent Framework to Simulate Language Learning and Group Communication (Lian et al., CoNLL 2024)
PDF:
https://aclanthology.org/2024.conll-1.19.pdf