Compositionality and Capacity in Emergent Languages

Abhinav Gupta, Cinjon Resnick, Jakob Foerster, Andrew Dai, Kyunghyun Cho


Abstract
Recent works have discussed the extent to which emergent languages can exhibit properties of natural languages, particularly compositionality. In this paper, we investigate the learning biases that affect the efficacy and compositionality of multi-agent communication, in addition to the effect of communicative bandwidth. Our foremost contribution is to explore how the capacity of a neural network impacts its ability to learn a compositional language. We additionally introduce a set of evaluation metrics with which we analyze the learned languages. Our hypothesis is that there is a specific range of model capacity and channel bandwidth that induces compositional structure in the resulting language and consequently encourages systematic generalization. While we empirically find evidence for the bottom of this range, we curiously do not find evidence for the top, and believe this is an open question for the community.
Anthology ID:
2020.repl4nlp-1.5
Original:
2020.repl4nlp-1.5v1
Version 2:
2020.repl4nlp-1.5v2
Volume:
Proceedings of the 5th Workshop on Representation Learning for NLP
Month:
July
Year:
2020
Address:
Online
Venues:
ACL | RepL4NLP | WS
SIG:
SIGREP
Publisher:
Association for Computational Linguistics
Pages:
34–38
URL:
https://aclanthology.org/2020.repl4nlp-1.5
DOI:
10.18653/v1/2020.repl4nlp-1.5
Cite (ACL):
Abhinav Gupta, Cinjon Resnick, Jakob Foerster, Andrew Dai, and Kyunghyun Cho. 2020. Compositionality and Capacity in Emergent Languages. In Proceedings of the 5th Workshop on Representation Learning for NLP, pages 34–38, Online. Association for Computational Linguistics.
Cite (Informal):
Compositionality and Capacity in Emergent Languages (Gupta et al., RepL4NLP 2020)
PDF:
https://aclanthology.org/2020.repl4nlp-1.5.pdf
Video:
http://slideslive.com/38929771