Exploring the Capacity of Pretrained Language Models for Reasoning about Actions and Change

Weinan He, Canming Huang, Zhanhao Xiao, Yongmei Liu


Abstract
Reasoning about actions and change (RAC) is essential for understanding and interacting with an ever-changing environment. Previous AI research has shown the importance of fundamental knowledge about actions, i.e., their preconditions and effects. However, traditional methods rely on logical formalization, which hinders practical applications. With recent transformer-based language models (LMs), reasoning over text has become desirable and seemingly feasible, raising the question of whether LMs can effectively and efficiently learn to solve RAC problems. We propose four essential RAC tasks as a comprehensive textual benchmark and generate problems in a way that minimizes the influence of other linguistic requirements (e.g., grounding), so as to focus on RAC. The resulting benchmark, TRAC, encompasses problems of varying complexity and facilitates a more fine-grained evaluation of LMs, precisely targeting the structural generalization ability that RAC demands. Experiments with three high-performing transformers indicate that additional effort is needed to meet the challenges raised by TRAC.
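The preconditions and effects mentioned in the abstract are the core of classical action formalisms such as STRIPS. As an illustration only (the sketch below is not from the paper, and the blocks-world action `pickup(A)` is a hypothetical example), here is a minimal Python rendering of a STRIPS-style action and state progression:

```python
# Illustrative sketch (not from the paper): a STRIPS-style action with
# preconditions and add/delete effects, plus simple state progression.
from dataclasses import dataclass

@dataclass(frozen=True)
class Action:
    name: str
    preconditions: frozenset  # facts that must hold before execution
    add_effects: frozenset    # facts the action makes true
    del_effects: frozenset    # facts the action makes false

    def applicable(self, state: frozenset) -> bool:
        # An action is executable iff all its preconditions hold.
        return self.preconditions <= state

    def apply(self, state: frozenset) -> frozenset:
        # Progress the state: remove deleted facts, then add new ones.
        if not self.applicable(state):
            raise ValueError(f"preconditions of {self.name} not met")
        return (state - self.del_effects) | self.add_effects

# Hypothetical example: picking up block A in a toy blocks world.
pickup_a = Action(
    name="pickup(A)",
    preconditions=frozenset({"clear(A)", "ontable(A)", "handempty"}),
    add_effects=frozenset({"holding(A)"}),
    del_effects=frozenset({"clear(A)", "ontable(A)", "handempty"}),
)

state = frozenset({"clear(A)", "ontable(A)", "handempty"})
print(pickup_a.applicable(state))     # True
print(sorted(pickup_a.apply(state)))  # ['holding(A)']
```

RAC tasks of the kind the benchmark targets amount to reasoning over such precondition/effect knowledge when it is expressed in text rather than logic.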
Anthology ID:
2023.acl-long.255
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
4629–4643
URL:
https://aclanthology.org/2023.acl-long.255
DOI:
10.18653/v1/2023.acl-long.255
Cite (ACL):
Weinan He, Canming Huang, Zhanhao Xiao, and Yongmei Liu. 2023. Exploring the Capacity of Pretrained Language Models for Reasoning about Actions and Change. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 4629–4643, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Exploring the Capacity of Pretrained Language Models for Reasoning about Actions and Change (He et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.255.pdf
Video:
https://aclanthology.org/2023.acl-long.255.mp4