Attention Is (not) All You Need for Commonsense Reasoning

Tassilo Klein, Moin Nabi


Abstract
The recently introduced BERT model exhibits strong performance on several language understanding benchmarks. In this paper, we describe a simple re-implementation of BERT for commonsense reasoning. We show that the attentions produced by BERT can be directly utilized for tasks such as the Pronoun Disambiguation Problem and Winograd Schema Challenge. Our proposed attention-guided commonsense reasoning method is conceptually simple yet empirically powerful. Experimental analysis on multiple datasets demonstrates that our proposed system performs remarkably well on all cases while outperforming the previously reported state of the art by a margin. While results suggest that BERT seems to implicitly learn to establish complex relationships between entities, solving commonsense reasoning tasks might require more than unsupervised models learned from huge text corpora.
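The abstract describes scoring candidate antecedents directly from BERT's self-attention maps. The sketch below illustrates that general idea with the HuggingFace transformers library: it sums the attention the ambiguous pronoun directs at each candidate entity and predicts the candidate with the larger score. The example sentence, the aggregation over all layers and heads, and the word-piece matching are simplifying assumptions for illustration, not the authors' exact scoring procedure.

# Illustrative sketch: rank Winograd-style candidates by the attention
# they receive from the pronoun. Aggregation over all layers/heads and
# the token matching are simplifying assumptions, not the paper's method.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

sentence = "The trophy doesn't fit in the suitcase because it is too big."
pronoun = "it"
candidates = ["trophy", "suitcase"]

inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions: one tensor per layer, each (batch, heads, seq, seq)
attn = torch.stack(outputs.attentions).squeeze(1)  # (layers, heads, seq, seq)

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
pronoun_idx = tokens.index(pronoun)

def candidate_score(candidate):
    # Sum the attention flowing from the pronoun to the candidate's
    # word pieces, over all layers and heads (an illustrative choice).
    pieces = tokenizer.tokenize(candidate)
    idxs = [i for i, t in enumerate(tokens) if t in pieces]
    return attn[:, :, pronoun_idx, idxs].sum().item()

scores = {c: candidate_score(c) for c in candidates}
print(scores)
print("prediction:", max(scores, key=scores.get))

Resolving the pronoun this way requires no task-specific fine-tuning; the decision is read off the pretrained model's attention weights, which is what makes the approach "conceptually simple" in the sense the abstract claims.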
Anthology ID:
P19-1477
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
4831–4836
URL:
https://aclanthology.org/P19-1477
DOI:
10.18653/v1/P19-1477
Cite (ACL):
Tassilo Klein and Moin Nabi. 2019. Attention Is (not) All You Need for Commonsense Reasoning. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 4831–4836, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Attention Is (not) All You Need for Commonsense Reasoning (Klein & Nabi, ACL 2019)
PDF:
https://aclanthology.org/P19-1477.pdf
Code:
additional community code
Data:
WSC