Are Transformers a Modern Version of ELIZA? Observations on French Object Verb Agreement

Bingzhi Li, Guillaume Wisniewski, Benoit Crabbé


Abstract
Many recent works have argued that the unsupervised sentence representations of neural networks encode syntactic information, based on the observation that neural language models can predict the agreement between a verb and its subject. We take a critical look at this line of research by showing that it is possible to achieve high accuracy on this agreement task with simple surface heuristics, indicating a possible flaw in our assessment of neural networks' syntactic ability. Our fine-grained analyses of results on long-range French object-verb agreement show that, contrary to LSTMs, Transformers are able to capture a non-trivial amount of grammatical structure.
Anthology ID:
2021.emnlp-main.377
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
4599–4610
URL:
https://aclanthology.org/2021.emnlp-main.377
DOI:
10.18653/v1/2021.emnlp-main.377
Cite (ACL):
Bingzhi Li, Guillaume Wisniewski, and Benoit Crabbé. 2021. Are Transformers a Modern Version of ELIZA? Observations on French Object Verb Agreement. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 4599–4610, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Are Transformers a Modern Version of ELIZA? Observations on French Object Verb Agreement (Li et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.377.pdf
Video:
https://aclanthology.org/2021.emnlp-main.377.mp4