About Time: Do Transformers Learn Temporal Verbal Aspect?

Eleni Metheniti, Tim Van De Cruys, Nabil Hathout


Abstract
Aspect is a linguistic concept that describes how an action, event, or state expressed by a verb phrase is situated in time. In this paper, we explore whether different transformer models are capable of identifying aspectual features. We focus on two specific aspectual features: telicity and duration. Telicity marks whether a verb's action or state has an endpoint (telic) or not (atelic), and duration denotes whether a verb expresses an action (dynamic) or a state (stative). These features are integral to the interpretation of natural language, but they are also hard to annotate and identify with NLP methods. We perform experiments in English and French, and our results show that transformer models adequately capture information on telicity and duration in their vectors, even in their non-finetuned forms, but are somewhat biased with regard to verb tense and word order.
Anthology ID:
2022.cmcl-1.10
Volume:
Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Emmanuele Chersoni, Nora Hollenstein, Cassandra Jacobs, Yohei Oseki, Laurent Prévot, Enrico Santus
Venue:
CMCL
Publisher:
Association for Computational Linguistics
Pages:
88–101
URL:
https://aclanthology.org/2022.cmcl-1.10
DOI:
10.18653/v1/2022.cmcl-1.10
Cite (ACL):
Eleni Metheniti, Tim Van De Cruys, and Nabil Hathout. 2022. About Time: Do Transformers Learn Temporal Verbal Aspect? In Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics, pages 88–101, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
About Time: Do Transformers Learn Temporal Verbal Aspect? (Metheniti et al., CMCL 2022)
PDF:
https://aclanthology.org/2022.cmcl-1.10.pdf
Video:
https://aclanthology.org/2022.cmcl-1.10.mp4
Code:
lenakmeth/telicity_classification
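
To make the abstract's claim concrete, here is a minimal probing sketch: it extracts sentence vectors from a frozen, non-finetuned transformer and trains a linear classifier to separate telic from atelic sentences. This is illustrative only and is not the authors' method; the model name (bert-base-uncased), the toy labeled examples, and the mean-pooling choice are all assumptions. The actual experiments and data are in the repository linked above.

```python
# Illustrative probing sketch (not the authors' code; see the linked repository).
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.linear_model import LogisticRegression

# Toy telicity labels (1 = telic, 0 = atelic); the paper uses
# annotated English and French corpora instead.
sentences = [
    ("She built a house.", 1),    # telic: the event has an endpoint
    ("She ran in the park.", 0),  # atelic: no inherent endpoint
]

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()  # frozen, non-finetuned weights

features, labels = [], []
with torch.no_grad():
    for text, label in sentences:
        inputs = tokenizer(text, return_tensors="pt")
        # Mean-pool the last hidden layer as a sentence representation
        hidden = model(**inputs).last_hidden_state  # shape: (1, seq_len, 768)
        features.append(hidden.mean(dim=1).squeeze(0).numpy())
        labels.append(label)

# A linear probe: if it separates telic from atelic examples, the
# frozen vectors encode some telicity information.
probe = LogisticRegression(max_iter=1000).fit(features, labels)
print(probe.score(features, labels))
```

The same setup extends to the duration feature (dynamic vs. stative) by swapping the labels, and to French by choosing a French pretrained model.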