Multi-agent Communication meets Natural Language: Synergies between Functional and Structural Language Learning

Angeliki Lazaridou, Anna Potapenko, Olivier Tieleman


Abstract
We present a method for combining multi-agent communication and traditional data-driven approaches to natural language learning, with the end goal of teaching agents to communicate with humans in natural language. Our starting point is a language model that has been trained on generic, non-task-specific language data. We then place this model in a multi-agent self-play environment that generates task-specific rewards used to adapt or modulate the model, turning it into a task-conditional language model. We introduce a new way of combining the two types of learning based on the idea of reranking language model samples, and show that this method outperforms others in communicating with humans in a visual referential communication task. Finally, we present a taxonomy of different types of language drift that can occur, alongside a set of measures to detect them.
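The reranking idea from the abstract can be illustrated with a minimal sketch: draw candidate messages from a language model, then pick the one that best trades off LM fluency (structural score) against a task-specific reward (functional score). Everything below is hypothetical, not the authors' code; the candidate messages and the two toy scoring functions are stand-ins for a real LM log-probability and a self-play reward.

```python
# Illustrative sketch of reranking language-model samples with a
# task-specific reward. The scorers and candidates are toy stand-ins.

def rerank(candidates, lm_logprob, task_score, alpha=0.5):
    """Return the candidate maximising a mix of LM and task scores.

    alpha trades off structural (LM fluency) against functional
    (task reward) scoring; alpha=1 ignores the task entirely.
    """
    def combined(msg):
        return alpha * lm_logprob(msg) + (1 - alpha) * task_score(msg)
    return max(candidates, key=combined)

# Toy usage in a referential-game flavour: reward messages that
# discriminate the target ("left") from distractors.
candidates = ["a red square", "red thing", "the red square on the left"]
lm_logprob = lambda m: -0.1 * len(m.split())        # toy fluency proxy
task_score = lambda m: 1.0 if "left" in m else 0.0  # toy discriminativeness
best = rerank(candidates, lm_logprob, task_score)
```

In this toy run the task reward outweighs the length penalty, so the more discriminative message wins even though it is less probable under the (toy) language model.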
Anthology ID:
2020.acl-main.685
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
7663–7674
URL:
https://aclanthology.org/2020.acl-main.685
DOI:
10.18653/v1/2020.acl-main.685
Cite (ACL):
Angeliki Lazaridou, Anna Potapenko, and Olivier Tieleman. 2020. Multi-agent Communication meets Natural Language: Synergies between Functional and Structural Language Learning. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 7663–7674, Online. Association for Computational Linguistics.
Cite (Informal):
Multi-agent Communication meets Natural Language: Synergies between Functional and Structural Language Learning (Lazaridou et al., ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.685.pdf
Video:
http://slideslive.com/38929310