%0 Conference Proceedings %T PLM-based World Models for Text-based Games %A Kim, Minsoo %A Jung, Yeonjoon %A Lee, Dohyeon %A Hwang, Seung-won %Y Goldberg, Yoav %Y Kozareva, Zornitsa %Y Zhang, Yue %S Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing %D 2022 %8 December %I Association for Computational Linguistics %C Abu Dhabi, United Arab Emirates %F kim-etal-2022-plm %X World models have improved the ability of reinforcement learning agents to operate in a sample-efficient manner, by being trained to predict plausible changes in the underlying environment. As the core tasks of world models are future prediction and commonsense understanding, our claim is that pre-trained language models (PLMs) already provide a strong base upon which to build world models. Worldformer is a recently proposed world model for text-based game environments, based only partially on PLMs and Transformers. Our distinction is to fully leverage PLMs as actionable world models in text-based game environments, by reformulating generation as constrained decoding, which decomposes actions into verb templates and objects. We show that our model improves future valid action prediction and graph change prediction. Additionally, we show that our model better reflects commonsense than a standard PLM. %R 10.18653/v1/2022.emnlp-main.86 %U https://aclanthology.org/2022.emnlp-main.86 %U https://doi.org/10.18653/v1/2022.emnlp-main.86 %P 1324-1341