Draw Me a Flower: Processing and Grounding Abstraction in Natural Language

Royi Lachmy, Valentina Pyatkin, Avshalom Manevich, Reut Tsarfaty


Abstract
Abstraction is a core tenet of human cognition and communication. When composing natural language instructions, humans naturally evoke abstraction to convey complex procedures in an efficient and concise way. Yet interpreting and grounding abstraction expressed in NL has not been systematically studied in NLP, and there are no accepted benchmarks that specifically elicit abstraction in NL. In this work, we set the foundation for a systematic study of processing and grounding abstraction in NLP. First, we deliver a novel abstraction elicitation method and present Hexagons, a 2D instruction-following game. Using Hexagons we collected over 4k naturally occurring visually-grounded instructions rich with diverse types of abstractions. From these data, we derive an instruction-to-execution task and assess different types of neural models. Our results show that contemporary models and modeling practices are substantially inferior to human performance, and that model performance is inversely correlated with the level of abstraction, with less satisfactory performance at higher levels of abstraction. These findings are consistent across models and setups, confirming that abstraction is a challenging phenomenon deserving further attention and study in NLP/AI research.
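To make the instruction-to-execution setting concrete, below is a minimal illustrative sketch of grounding an abstract instruction into atomic drawing actions on a hexagonal board. The board dimensions, color set, action format, and example instruction are assumptions chosen for illustration only, not the paper's actual data format or annotation scheme.

```python
# Hypothetical sketch of instruction-to-execution grounding on a hex grid,
# loosely inspired by the Hexagons game described above. All specifics
# (board size, colors, action tuples) are illustrative assumptions.

from dataclasses import dataclass, field

ROWS, COLS = 10, 18  # assumed board dimensions
COLORS = {"white", "red", "green", "blue", "yellow", "purple", "orange", "black"}

@dataclass
class Board:
    # Each cell is addressed by (row, col) and holds a color; all start white.
    cells: dict = field(default_factory=lambda: {
        (r, c): "white" for r in range(ROWS) for c in range(COLS)
    })

    def paint(self, row: int, col: int, color: str) -> None:
        """Execute a single atomic 'paint' action."""
        assert color in COLORS and (row, col) in self.cells
        self.cells[(row, col)] = color

def execute(board: Board, actions: list) -> Board:
    """Apply a sequence of (row, col, color) actions derived from an instruction."""
    for row, col, color in actions:
        board.paint(row, col, color)
    return board

# An abstract instruction such as "paint every second cell in the top row red"
# grounds into a concrete action sequence; a model would have to produce this
# sequence from the instruction (and the current board state).
instruction = "paint every second cell in the top row red"
actions = [(0, c, "red") for c in range(0, COLS, 2)]
board = execute(Board(), actions)
print(instruction, "->", sum(v == "red" for v in board.cells.values()), "cells painted")
```

In such a setting, evaluation can compare the predicted board state against the gold board state cell by cell, which is one plausible way an execution-level metric could be defined.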
Anthology ID:
2022.tacl-1.77
Volume:
Transactions of the Association for Computational Linguistics, Volume 10
Month:
Year:
2022
Address:
Cambridge, MA
Editors:
Brian Roark, Ani Nenkova
Venue:
TACL
Publisher:
MIT Press
Pages:
1341–1356
URL:
https://aclanthology.org/2022.tacl-1.77
DOI:
10.1162/tacl_a_00522
Cite (ACL):
Royi Lachmy, Valentina Pyatkin, Avshalom Manevich, and Reut Tsarfaty. 2022. Draw Me a Flower: Processing and Grounding Abstraction in Natural Language. Transactions of the Association for Computational Linguistics, 10:1341–1356.
Cite (Informal):
Draw Me a Flower: Processing and Grounding Abstraction in Natural Language (Lachmy et al., TACL 2022)
PDF:
https://aclanthology.org/2022.tacl-1.77.pdf