tagE: Enabling an Embodied Agent to Understand Human Instructions

Chayan Sarkar, Avik Mitra, Pradip Pramanick, Tapas Nayak


Abstract
Natural language serves as the primary mode of communication when an intelligent agent with a physical presence engages with human beings. While a large body of research focuses on natural language understanding (NLU), encompassing tasks such as sentiment analysis, intent prediction, question answering, and summarization, NLU for situations that require tangible actions by an embodied agent remains limited. The ambiguity and incompleteness inherent in natural language make it challenging for intelligent agents to decipher human intention. To tackle this problem, we introduce a novel system, task and argument grounding for Embodied agents (tagE). At its core, our system employs a neural network model that extracts a series of tasks from complex task instructions expressed in natural language. The model adopts an encoder-decoder framework enriched with nested decoding to extract tasks and their corresponding arguments from these intricate instructions. The extracted tasks are then mapped (grounded) to the robot's established set of skills, while the arguments are grounded in objects present in the environment. To facilitate training and evaluation, we have curated a dataset of complex instructions. Our experiments show that the approach outperforms strong baseline models.
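To make the abstract's description more concrete, the sketch below shows one way an encoder-decoder with nested decoding could be organized: an outer decoder step per predicted task, grounded to a fixed skill inventory, and an inner decoder loop per argument, grounded to a fixed object inventory. This is a minimal illustration only; the module choices, names, and dimensions are assumptions and do not reproduce the authors' tagE implementation.

```python
# Illustrative sketch (not the authors' code): encoder-decoder with nested
# task -> argument decoding, grounding tasks to skills and arguments to objects.
import torch
import torch.nn as nn


class NestedTaskDecoder(nn.Module):
    def __init__(self, vocab_size, num_skills, num_objects, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # Outer decoder: one step per predicted task, classified over robot skills.
        self.task_decoder = nn.GRUCell(hid_dim, hid_dim)
        self.task_head = nn.Linear(hid_dim, num_skills)
        # Inner (nested) decoder: one step per argument, classified over objects.
        self.arg_decoder = nn.GRUCell(hid_dim, hid_dim)
        self.arg_head = nn.Linear(hid_dim, num_objects)

    def forward(self, token_ids, max_tasks=3, max_args=2):
        # Encode the instruction into a summary vector of shape (batch, hid_dim).
        _, h = self.encoder(self.embed(token_ids))
        h = h.squeeze(0)
        task_state = torch.zeros_like(h)
        outputs = []
        for _ in range(max_tasks):
            task_state = self.task_decoder(h, task_state)
            skill_logits = self.task_head(task_state)
            # Nested decoding: argument states are initialized from the task state.
            arg_state = task_state
            arg_logits = []
            for _ in range(max_args):
                arg_state = self.arg_decoder(h, arg_state)
                arg_logits.append(self.arg_head(arg_state))
            outputs.append((skill_logits, torch.stack(arg_logits, dim=1)))
        return outputs


# Toy usage with placeholder token ids (batch of one 8-token instruction).
model = NestedTaskDecoder(vocab_size=100, num_skills=10, num_objects=20)
tokens = torch.randint(0, 100, (1, 8))
for skill_logits, arg_logits in model(tokens):
    print(skill_logits.shape, arg_logits.shape)  # (1, 10) and (1, 2, 20)
```

The point of the nesting is that each inner argument loop is conditioned on the current outer task state, so arguments are decoded per task rather than as one flat sequence for the whole instruction.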
Anthology ID: 2023.findings-emnlp.593
Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 8846–8857
URL: https://aclanthology.org/2023.findings-emnlp.593
DOI: 10.18653/v1/2023.findings-emnlp.593
Cite (ACL): Chayan Sarkar, Avik Mitra, Pradip Pramanick, and Tapas Nayak. 2023. tagE: Enabling an Embodied Agent to Understand Human Instructions. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 8846–8857, Singapore. Association for Computational Linguistics.
Cite (Informal): tagE: Enabling an Embodied Agent to Understand Human Instructions (Sarkar et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-emnlp.593.pdf