Don’t Copy the Teacher: Data and Model Challenges in Embodied Dialogue

So Yeon Min, Hao Zhu, Ruslan Salakhutdinov, Yonatan Bisk


Abstract
Embodied dialogue instruction following requires an agent to complete a complex sequence of tasks from a natural language exchange. The recent introduction of benchmarks raises the question of how best to train and evaluate models for this multi-turn, multi-agent, long-horizon task. This paper contributes to that conversation by arguing that imitation learning (IL) and related low-level metrics are misaligned with the goals of embodied dialogue research and may hinder progress. We provide empirical comparisons of metrics, analyze three models, and make suggestions for how the field might best progress. First, we observe that models trained with IL take spurious actions during evaluation. Second, we find that existing models fail to ground query utterances, which are essential for task completion. Third, we argue that evaluation should focus on higher-level semantic goals. We will release code to additionally filter the data and benchmark models for improved evaluation.
Anthology ID:
2022.emnlp-main.635
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
9361–9368
URL:
https://aclanthology.org/2022.emnlp-main.635
DOI:
10.18653/v1/2022.emnlp-main.635
Cite (ACL):
So Yeon Min, Hao Zhu, Ruslan Salakhutdinov, and Yonatan Bisk. 2022. Don’t Copy the Teacher: Data and Model Challenges in Embodied Dialogue. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 9361–9368, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Don’t Copy the Teacher: Data and Model Challenges in Embodied Dialogue (Min et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.635.pdf