Attention-based Contrastive Learning for Winograd Schemas

Tassilo Klein, Moin Nabi


Abstract
Self-supervised learning has recently attracted considerable attention in the NLP community for its ability to learn discriminative features using a contrastive objective. This paper investigates whether contrastive learning can be extended to Transformer attention to tackle the Winograd Schema Challenge. To this end, we propose a novel self-supervised framework, leveraging a contrastive loss directly at the level of self-attention. Experimental analysis of our attention-based models on multiple datasets demonstrates superior commonsense reasoning capabilities. The proposed approach outperforms all comparable unsupervised approaches while occasionally surpassing supervised ones.
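
Since only the abstract is reproduced here, the following is a minimal, hypothetical PyTorch sketch of what a contrastive loss applied directly to self-attention could look like for a Winograd-style candidate pair. The function name, arguments, and margin formulation are illustrative assumptions, not the authors' released implementation (see the linked code repository for that).

# Hypothetical sketch: a margin-based contrastive loss over Transformer
# self-attention, contrasting the attention mass a pronoun assigns to a
# correct vs. an incorrect candidate antecedent in a Winograd-style sentence.
import torch
import torch.nn.functional as F

def attention_contrastive_loss(attn, pron_idx, pos_idx, neg_idx, margin=0.1):
    """attn: [batch, heads, seq, seq] self-attention weights from one layer.
    pron_idx / pos_idx / neg_idx: LongTensors of shape [batch] giving the token
    positions of the pronoun and the correct / incorrect candidates."""
    batch = torch.arange(attn.size(0))
    # Attention the pronoun pays to every token, averaged over heads: [batch, seq]
    pron_attn = attn[batch, :, pron_idx, :].mean(dim=1)
    pos_score = pron_attn[batch, pos_idx]   # attention to correct candidate
    neg_score = pron_attn[batch, neg_idx]   # attention to incorrect candidate
    # Hinge-style contrastive objective: the correct candidate should receive
    # more attention than the incorrect one, by at least `margin`.
    return F.relu(margin - (pos_score - neg_score)).mean()

The sketch uses a single layer's attention and a fixed margin purely for illustration; the actual framework may aggregate attention differently or use a different contrastive formulation.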
Anthology ID:
2021.findings-emnlp.208
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2428–2434
URL:
https://aclanthology.org/2021.findings-emnlp.208
DOI:
10.18653/v1/2021.findings-emnlp.208
Cite (ACL):
Tassilo Klein and Moin Nabi. 2021. Attention-based Contrastive Learning for Winograd Schemas. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 2428–2434, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Attention-based Contrastive Learning for Winograd Schemas (Klein & Nabi, Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.208.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.208.mp4
Code
sap-samples/emnlp2021-attention-contrastive-learning
Data
WSC
WinoGrande