Designing Templates for Eliciting Commonsense Knowledge from Pretrained Sequence-to-Sequence Models

Jheng-Hong Yang, Sheng-Chieh Lin, Rodrigo Nogueira, Ming-Feng Tsai, Chuan-Ju Wang, Jimmy Lin


Abstract
While internalized “implicit knowledge” in pretrained transformers has led to fruitful progress in many natural language understanding tasks, how to most effectively elicit such knowledge remains an open question. Based on the text-to-text transfer transformer (T5) model, this work explores a template-based approach to extract implicit knowledge for commonsense reasoning on multiple-choice (MC) question answering tasks. Experiments on three representative MC datasets show the surprisingly good performance of our simple template, coupled with a logit normalization technique for disambiguation. Furthermore, we verify that our proposed template can be easily extended to other MC tasks with contexts such as supporting facts in open-book question answering settings. Starting from the MC task, this work initiates further research to find generic natural language templates that can effectively leverage stored knowledge in pretrained models.
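As a rough illustration of the approach described in the abstract, the sketch below shows how a template-based multiple-choice scorer built on T5 with a length-normalized log-likelihood (a simple stand-in for the paper's logit normalization) might look. The template wording, checkpoint name, and helper functions are illustrative assumptions using the Hugging Face transformers API, not the authors' released code.

```python
# Minimal sketch of template-based MC scoring with T5 (illustrative only).
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-large")
model = T5ForConditionalGeneration.from_pretrained("t5-large")
model.eval()

def score_choice(question: str, choice: str) -> float:
    # Hypothetical template: phrase the MC item as a source/target pair for T5.
    source = f"question: {question} choice: {choice}"
    inputs = tokenizer(source, return_tensors="pt")
    labels = tokenizer(choice, return_tensors="pt").input_ids
    with torch.no_grad():
        out = model(**inputs, labels=labels)
    # out.loss is the mean per-token cross-entropy over the target, so its
    # negation is a length-normalized log-likelihood; this normalization
    # reduces bias toward short answer options.
    return -out.loss.item()

def answer(question: str, choices: list[str]) -> str:
    # Pick the option with the highest normalized score.
    scores = [score_choice(question, c) for c in choices]
    return choices[max(range(len(choices)), key=scores.__getitem__)]
```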
Anthology ID:
2020.coling-main.307
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
3449–3453
URL:
https://aclanthology.org/2020.coling-main.307
DOI:
10.18653/v1/2020.coling-main.307
Cite (ACL):
Jheng-Hong Yang, Sheng-Chieh Lin, Rodrigo Nogueira, Ming-Feng Tsai, Chuan-Ju Wang, and Jimmy Lin. 2020. Designing Templates for Eliciting Commonsense Knowledge from Pretrained Sequence-to-Sequence Models. In Proceedings of the 28th International Conference on Computational Linguistics, pages 3449–3453, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Designing Templates for Eliciting Commonsense Knowledge from Pretrained Sequence-to-Sequence Models (Yang et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.307.pdf
Data
OpenBookQA, WinoGrande