Multitasking Inhibits Semantic Drift

Athul Paul Jacob, Mike Lewis, Jacob Andreas


Abstract
When intelligent agents communicate to accomplish shared goals, how do these goals shape the agents’ language? We study the dynamics of learning in latent language policies (LLPs), in which instructor agents generate natural-language subgoal descriptions and executor agents map these descriptions to low-level actions. LLPs can solve challenging long-horizon reinforcement learning problems and provide a rich model for studying task-oriented language use. But previous work has found that LLP training is prone to semantic drift (use of messages in ways inconsistent with their original natural language meanings). Here, we demonstrate theoretically and empirically that multitask training is an effective counter to this problem: we prove that multitask training eliminates semantic drift in a well-studied family of signaling games, and show that multitask training of neural LLPs in a complex strategy game reduces drift while improving sample efficiency.
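The abstract's core idea can be illustrated with a toy sketch (not the paper's actual method or experimental setup): an instructor maps goals to messages, an executor maps messages to actions, and training mixes a self-play reward with a supervised "anchoring" term that ties messages to their original human meanings. All names, policies, and the tabular learning rule below are illustrative assumptions.

```python
import random

# Toy Lewis-style signaling game with a latent language policy (LLP):
# the instructor picks a message for a goal; the executor picks an
# action given the message. Pure self-play can drift (messages used
# inconsistently with their human meanings); a multitask/supervised
# term anchors messages to their original meanings.

GOALS = ["left", "right"]
MESSAGES = ["go left", "go right"]
ACTIONS = ["move_left", "move_right"]

# Ground-truth human meanings used for the supervised (multitask) signal.
HUMAN_MEANING = {"left": "go left", "right": "go right"}
HUMAN_ACTION = {"go left": "move_left", "go right": "move_right"}

# Tabular policies: preference scores over messages/actions.
instr = {g: {m: 0.0 for m in MESSAGES} for g in GOALS}
execu = {m: {a: 0.0 for a in ACTIONS} for m in MESSAGES}

def sample(prefs, rng, eps=0.1):
    # Epsilon-greedy choice over a preference table.
    if rng.random() < eps:
        return rng.choice(list(prefs))
    return max(prefs, key=prefs.get)

def train(steps=2000, sup_weight=0.5, seed=0):
    rng = random.Random(seed)
    for _ in range(steps):
        g = rng.choice(GOALS)
        m = sample(instr[g], rng)
        a = sample(execu[m], rng)
        # Self-play reward: did the executor reach the goal?
        r = 1.0 if a == "move_" + g else -1.0
        instr[g][m] += r
        execu[m][a] += r
        # Multitask term: anchor both agents to the human protocol.
        instr[g][HUMAN_MEANING[g]] += sup_weight
        execu[m][HUMAN_ACTION[m]] += sup_weight

train()
```

After training, both policies should use each message consistently with its human meaning; dropping the `sup_weight` term allows self-play to settle on an arbitrary (possibly swapped) code, which is the drift the paper's multitask objective is meant to prevent.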
Anthology ID:
2021.naacl-main.421
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
5351–5366
URL:
https://aclanthology.org/2021.naacl-main.421
DOI:
10.18653/v1/2021.naacl-main.421
Cite (ACL):
Athul Paul Jacob, Mike Lewis, and Jacob Andreas. 2021. Multitasking Inhibits Semantic Drift. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 5351–5366, Online. Association for Computational Linguistics.
Cite (Informal):
Multitasking Inhibits Semantic Drift (Jacob et al., NAACL 2021)
PDF:
https://aclanthology.org/2021.naacl-main.421.pdf
Video:
https://aclanthology.org/2021.naacl-main.421.mp4