A Comprehensive Comparison of Neural Networks as Cognitive Models of Inflection

Adam Wiemerslage, Shiran Dudy, Katharina Kann


Abstract
Neural networks have long been at the center of a debate around the cognitive mechanism by which humans process inflectional morphology. This debate has gravitated into NLP by way of the question: Are neural networks a feasible account for human behavior in morphological inflection? We address that question by measuring the correlation between human judgments and neural network probabilities for unknown word inflections. We test a larger range of architectures than previously studied on two important tasks for the cognitive processing debate: English past tense and German number inflection. We find evidence that the Transformer may be a better account of human behavior than LSTMs on these datasets, and that LSTM features known to increase inflection accuracy do not always result in more human-like behavior.
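The evaluation the abstract describes — correlating human acceptability judgments with model probabilities for inflected nonce words — can be sketched with a rank correlation. The data below is purely illustrative (hypothetical nonce-verb ratings and log-probabilities, not from the paper), and Spearman's rho is one plausible choice of correlation measure, implemented here without external dependencies:

```python
# Hedged sketch: correlating hypothetical human wug-test ratings with
# hypothetical model log-probabilities via Spearman rank correlation.

def rankdata(xs):
    """Assign 1-based ranks, averaging ranks for tied values."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        # extend j over any run of tied values
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the two rank vectors."""
    rx, ry = rankdata(x), rankdata(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Toy inputs: human ratings (e.g., 1-7 scale) and model log-probabilities
# for candidate past-tense forms of made-up verbs.
human_ratings = [6.2, 5.8, 3.1, 2.4, 4.5]
model_logprobs = [-1.2, -1.5, -4.0, -5.1, -2.8]
rho = spearman(human_ratings, model_logprobs)
```

Here the toy rankings agree perfectly, so `rho` comes out as 1.0; real judgment data would of course yield intermediate values, which is what makes the comparison across architectures informative.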
Anthology ID:
2022.emnlp-main.126
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1933–1945
URL:
https://aclanthology.org/2022.emnlp-main.126
DOI:
10.18653/v1/2022.emnlp-main.126
Bibkey:
Cite (ACL):
Adam Wiemerslage, Shiran Dudy, and Katharina Kann. 2022. A Comprehensive Comparison of Neural Networks as Cognitive Models of Inflection. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 1933–1945, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
A Comprehensive Comparison of Neural Networks as Cognitive Models of Inflection (Wiemerslage et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.126.pdf