Do pretrained transformers infer telicity like humans?

Yiyun Zhao, Jian Gang Ngui, Lucy Hall Hartley, Steven Bethard


Abstract
Pretrained transformer-based language models achieve state-of-the-art performance in many NLP tasks, but it is an open question whether the knowledge acquired by the models during pretraining resembles the linguistic knowledge of humans. We present both humans and pretrained transformers with descriptions of events, and measure their preference for telic interpretations (the event has a natural endpoint) or atelic interpretations (the event does not have a natural endpoint). To measure these preferences and determine what factors influence them, we design an English test and a novel-word test that include a variety of linguistic cues (noun phrase quantity, resultative structure, contextual information, temporal units) that bias toward certain interpretations. We find that humans’ choice of telicity interpretation is reliably influenced by theoretically-motivated cues, transformer models (BERT and RoBERTa) are influenced by some (though not all) of the cues, and transformer models often rely more heavily on temporal units than humans do.
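As a rough illustration of what "measuring a preference for telic vs. atelic interpretations" could look like with a masked language model, the sketch below compares the probability a model assigns to a telic-biasing temporal cue ("in an hour") versus an atelic-biasing one ("for an hour") at a masked position. This is a toy probe assuming Hugging Face transformers and bert-base-uncased; the sentence and cue words are illustrative assumptions, not the paper's stimuli or evaluation protocol.

```python
# Minimal sketch (not the authors' released code): probe a masked LM's
# preference between a telic-biasing and an atelic-biasing temporal cue.
# The example sentence and cue words are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def cue_probability(sentence_with_mask: str, cue_word: str) -> float:
    """Probability the model assigns to `cue_word` at the [MASK] position."""
    inputs = tokenizer(sentence_with_mask, return_tensors="pt")
    mask_idx = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
    with torch.no_grad():
        logits = model(**inputs).logits
    probs = torch.softmax(logits[0, mask_idx], dim=-1)
    cue_id = tokenizer.convert_tokens_to_ids(cue_word)
    return probs[0, cue_id].item()

# "in" biases toward a telic reading ("ate an apple in an hour"),
# "for" biases toward an atelic reading ("ate apples for an hour").
sentence = "She ate an apple [MASK] an hour."
p_telic = cue_probability(sentence, "in")
p_atelic = cue_probability(sentence, "for")
print("prefers telic cue" if p_telic > p_atelic else "prefers atelic cue")
```

A comparable sketch could swap in roberta-base (with its `<mask>` token) to contrast the two model families, though the paper's actual tests also manipulate noun phrase quantity, resultative structure, and context, not only temporal units.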
Anthology ID:
2021.conll-1.6
Original:
2021.conll-1.6v1
Version 2:
2021.conll-1.6v2
Volume:
Proceedings of the 25th Conference on Computational Natural Language Learning
Month:
November
Year:
2021
Address:
Online
Editors:
Arianna Bisazza, Omri Abend
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
72–81
URL:
https://aclanthology.org/2021.conll-1.6
DOI:
10.18653/v1/2021.conll-1.6
Bibkey:
Cite (ACL):
Yiyun Zhao, Jian Gang Ngui, Lucy Hall Hartley, and Steven Bethard. 2021. Do pretrained transformers infer telicity like humans? In Proceedings of the 25th Conference on Computational Natural Language Learning, pages 72–81, Online. Association for Computational Linguistics.
Cite (Informal):
Do pretrained transformers infer telicity like humans? (Zhao et al., CoNLL 2021)
PDF:
https://aclanthology.org/2021.conll-1.6.pdf
Video:
https://aclanthology.org/2021.conll-1.6.mp4