The Effect of Efficient Messaging and Input Variability on Neural-Agent Iterated Language Learning

Yuchen Lian, Arianna Bisazza, Tessa Verhoef


Abstract
Natural languages display a trade-off among different strategies to convey syntactic structure, such as word order or inflection. This trade-off, however, has not appeared in recent simulations of iterated language learning with neural network agents (Chaabouni et al., 2019b). We re-evaluate this result in light of three factors that play an important role in comparable experiments from the Language Evolution field: (i) speaker bias towards efficient messaging, (ii) non-systematic input languages, and (iii) learning bottleneck. Our simulations show that neural agents mainly strive to maintain the utterance type distribution observed during learning, instead of developing a more efficient or systematic language.
Anthology ID:
2021.emnlp-main.794
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Note:
Pages:
10121–10129
URL:
https://aclanthology.org/2021.emnlp-main.794
DOI:
10.18653/v1/2021.emnlp-main.794
Cite (ACL):
Yuchen Lian, Arianna Bisazza, and Tessa Verhoef. 2021. The Effect of Efficient Messaging and Input Variability on Neural-Agent Iterated Language Learning. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 10121–10129, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
The Effect of Efficient Messaging and Input Variability on Neural-Agent Iterated Language Learning (Lian et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.794.pdf
Video:
https://aclanthology.org/2021.emnlp-main.794.mp4