Slaapte or Sliep? Extending Neural-Network Simulations of English Past Tense Learning to Dutch and German

Xiulin Yang, Jingyan Chen, Arjan van Eerden, Ahnaf Samin, Arianna Bisazza


Abstract
This work studies the plausibility of sequence-to-sequence neural networks as models of human morphological acquisition. We replicate the findings of Kirov and Cotterell (2018) on the well-known challenge of the English past tense and examine how they generalize to two related but morphologically richer languages, namely Dutch and German. Using a new dataset of English/Dutch/German (ir)regular verb forms, we show that the major findings of Kirov and Cotterell (2018) hold for all three languages, including over-regularization errors and micro U-shaped learning trajectories. At the same time, we observe troublesome cases of non-human-like errors, similar to those reported by recent follow-up studies on other languages and neural architectures. Finally, we study the possibility of switching to orthographic input when pronunciation information is unavailable, and show that this can have a non-negligible impact on the simulation results, potentially yielding misleading findings.
Anthology ID: 2023.nodalida-1.11
Volume: Proceedings of the 24th Nordic Conference on Computational Linguistics (NoDaLiDa)
Month: May
Year: 2023
Address: Tórshavn, Faroe Islands
Editors: Tanel Alumäe, Mark Fishel
Venue: NoDaLiDa
Publisher: University of Tartu Library
Pages: 92–102
URL: https://aclanthology.org/2023.nodalida-1.11
Cite (ACL): Xiulin Yang, Jingyan Chen, Arjan van Eerden, Ahnaf Samin, and Arianna Bisazza. 2023. Slaapte or Sliep? Extending Neural-Network Simulations of English Past Tense Learning to Dutch and German. In Proceedings of the 24th Nordic Conference on Computational Linguistics (NoDaLiDa), pages 92–102, Tórshavn, Faroe Islands. University of Tartu Library.
Cite (Informal): Slaapte or Sliep? Extending Neural-Network Simulations of English Past Tense Learning to Dutch and German (Yang et al., NoDaLiDa 2023)
PDF: https://aclanthology.org/2023.nodalida-1.11.pdf