Learning Neuro-Symbolic World Models with Conversational Proprioception

Don Joven Agravante, Daiki Kimura, Michiaki Tatsubori, Asim Munawar, Alexander Gray


Abstract
The recent emergence of Neuro-Symbolic Agent (NeSA) approaches to natural language-based interactions calls for the investigation of model-based approaches. In contrast to the model-free approaches that existing NeSAs take, learning an explicit world model has interesting potential, especially for explainability, which is one of the key selling points of NeSA. To learn useful world models, we leverage one of the recent neuro-symbolic architectures, Logical Neural Networks (LNN). Here, we describe a method that can learn neuro-symbolic world models on the TextWorld-Commonsense set of games. We then show how this can be improved further by taking inspiration from the concept of proprioception, but for conversation. This is done by enhancing the internal logic state with a memory of previous actions while also guiding future actions by augmenting the learned model with constraints based on this memory. This greatly improves the game-solving agent's performance in a TextWorld setting, where the advantage over the baseline is an 85% average reduction in steps and a 2.3× higher average score.
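As a rough illustration of the "conversational proprioception" idea described above (not the authors' implementation, whose details are in the paper), the minimal sketch below shows one way an action memory could be kept as logical facts and used to constrain a learned world model's candidate actions. All names here (ProprioceptiveMemory, the "tried" predicate, the example actions) are hypothetical.

```python
# Illustrative sketch only: a hypothetical rendering of keeping a memory of
# previous actions as logical facts and using it to constrain future actions.
# Class and predicate names are invented for illustration, not from the paper.

class ProprioceptiveMemory:
    """Tracks previously issued actions as facts and uses them to
    constrain the candidate actions proposed by a learned world model."""

    def __init__(self):
        self.action_history = []   # ordered memory of past actions
        self.facts = set()         # e.g. {("tried", "open fridge"), ...}

    def record(self, action: str) -> None:
        """Add an executed action to the internal logic state."""
        self.action_history.append(action)
        self.facts.add(("tried", action))

    def constrain(self, candidate_actions: list[str]) -> list[str]:
        """Apply a simple constraint derived from memory:
        do not propose an action that has already been tried."""
        return [a for a in candidate_actions
                if ("tried", a) not in self.facts]


# Hypothetical usage inside a TextWorld-style game loop:
memory = ProprioceptiveMemory()
memory.record("open fridge")
candidates = ["open fridge", "take apple", "go east"]
print(memory.constrain(candidates))  # -> ["take apple", "go east"]
```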
Anthology ID: 2023.acl-short.57
Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 648–656
URL: https://aclanthology.org/2023.acl-short.57
DOI: 10.18653/v1/2023.acl-short.57
Cite (ACL): Don Joven Agravante, Daiki Kimura, Michiaki Tatsubori, Asim Munawar, and Alexander Gray. 2023. Learning Neuro-Symbolic World Models with Conversational Proprioception. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 648–656, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): Learning Neuro-Symbolic World Models with Conversational Proprioception (Agravante et al., ACL 2023)
PDF: https://aclanthology.org/2023.acl-short.57.pdf
Video: https://aclanthology.org/2023.acl-short.57.mp4