Production-based Cognitive Models as a Test Suite for Reinforcement Learning Algorithms

Adrian Brasoveanu, Jakub Dotlacil


Abstract
We introduce a framework in which production-rule-based computational cognitive modeling and Reinforcement Learning can systematically interact and inform each other. We focus on linguistic applications because the sophisticated rule-based cognitive models needed to capture linguistic behavioral data promise to provide a stringent test suite for RL algorithms, connecting them to both accuracy and reaction-time experimental data. Thus, we open a path towards assembling an experimentally rigorous and cognitively realistic benchmark for RL algorithms. We extend our previous work on lexical decision tasks and tabular RL algorithms (Brasoveanu and Dotlačil, 2020b) with a discussion of neural-network-based approaches and of how parsing can be formalized as an RL problem.
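As a rough illustration of the kind of tabular RL algorithm the abstract refers to, the sketch below implements a standard Q-learning update with epsilon-greedy action selection in Python. It is not the paper's implementation: the toy states, actions, and reward scheme are placeholder assumptions standing in for the model configurations and production-rule choices a cognitive model would supply.

    # Minimal tabular Q-learning sketch (illustrative only; not the paper's code).
    # States and actions are abstract integers standing in for cognitive-model
    # configurations and production-rule choices.
    import random
    from collections import defaultdict

    def q_learning_update(Q, state, action, reward, next_state, actions,
                          alpha=0.1, gamma=0.99):
        """One tabular backup: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
        best_next = max(Q[(next_state, a)] for a in actions)
        td_target = reward + gamma * best_next
        Q[(state, action)] += alpha * (td_target - Q[(state, action)])

    def epsilon_greedy(Q, state, actions, epsilon=0.1):
        """Pick a random action with probability epsilon, otherwise the greedy one."""
        if random.random() < epsilon:
            return random.choice(actions)
        return max(actions, key=lambda a: Q[(state, a)])

    if __name__ == "__main__":
        # Toy environment: one decision state; action 1 is the rewarded response.
        actions = [0, 1]
        Q = defaultdict(float)
        for _ in range(500):
            s = 0
            a = epsilon_greedy(Q, s, actions)
            r = 1.0 if a == 1 else 0.0
            q_learning_update(Q, s, a, r, next_state=1, actions=actions)
        print({k: round(v, 2) for k, v in Q.items()})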
Anthology ID: 2020.cmcl-1.3
Volume: Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics
Month: November
Year: 2020
Address: Online
Editors: Emmanuele Chersoni, Cassandra Jacobs, Yohei Oseki, Laurent Prévot, Enrico Santus
Venue: CMCL
Publisher: Association for Computational Linguistics
Pages: 28–37
URL: https://aclanthology.org/2020.cmcl-1.3
DOI: 10.18653/v1/2020.cmcl-1.3
Cite (ACL):
Adrian Brasoveanu and Jakub Dotlacil. 2020. Production-based Cognitive Models as a Test Suite for Reinforcement Learning Algorithms. In Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics, pages 28–37, Online. Association for Computational Linguistics.
Cite (Informal):
Production-based Cognitive Models as a Test Suite for Reinforcement Learning Algorithms (Brasoveanu & Dotlacil, CMCL 2020)
PDF: https://aclanthology.org/2020.cmcl-1.3.pdf