Spelling convention sensitivity in neural language models

Elizabeth Nielsen, Christo Kirov, Brian Roark

Abstract
We examine whether large neural language models, trained on very large collections of varied English text, learn the potentially long-distance dependency of British versus American spelling conventions, i.e., whether spelling is consistently one or the other within model-generated strings. In contrast to long-distance dependencies in non-surface underlying structure (e.g., syntax), spelling consistency is easier to measure both in LMs and the text corpora used to train them, which can provide additional insight into certain observed model behaviors. Using a set of probe words unique to either British or American English, we first establish that training corpora exhibit substantial (though not total) consistency. A large T5 language model does appear to internalize this consistency, though only with respect to observed lexical items (not nonce words with British/American spelling patterns). We further experiment with correcting for biases in the training data by fine-tuning T5 on synthetic data that has been debiased, and find that fine-tuned T5 remains only somewhat sensitive to spelling consistency. Further experiments show GPT2 to be similarly limited.
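To make the probe-word measurement concrete, here is a minimal sketch of how convention-specific vocabulary can quantify spelling consistency within a string. It is an illustration only, not the authors' code: the BRITISH/AMERICAN probe sets and the consistency function are hypothetical assumptions, and the paper's actual probe list and scoring procedure are not reproduced here.

import re
from collections import Counter

# Illustrative probe lists; the paper uses a larger curated set of words
# whose spelling is unique to one convention (hypothetical examples here).
BRITISH = {"colour", "flavour", "centre", "analyse", "travelling"}
AMERICAN = {"color", "flavor", "center", "analyze", "traveling"}

def convention_counts(text: str) -> Counter:
    """Count occurrences of British-only and American-only probe words."""
    counts = Counter()
    for tok in re.findall(r"[a-z]+", text.lower()):
        if tok in BRITISH:
            counts["british"] += 1
        elif tok in AMERICAN:
            counts["american"] += 1
    return counts

def consistency(text: str) -> float | None:
    """Fraction of probe tokens matching the majority convention.

    Returns None if no probe words occur; 1.0 means the string is fully
    consistent with a single spelling convention.
    """
    counts = convention_counts(text)
    total = counts["british"] + counts["american"]
    if total == 0:
        return None
    return max(counts.values()) / total

# Mixed-convention example: two British probes, one American probe.
print(consistency("The colour of the center had a nice flavour."))  # ~0.667

Applied to training-corpus documents or model-generated continuations, a score near 1.0 indicates the kind of within-string consistency the paper probes for.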
Anthology ID: 2023.findings-eacl.98
Volume: Findings of the Association for Computational Linguistics: EACL 2023
Month: May
Year: 2023
Address: Dubrovnik, Croatia
Editors: Andreas Vlachos, Isabelle Augenstein
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 1334–1346
URL: https://aclanthology.org/2023.findings-eacl.98
DOI: 10.18653/v1/2023.findings-eacl.98
Cite (ACL): Elizabeth Nielsen, Christo Kirov, and Brian Roark. 2023. Spelling convention sensitivity in neural language models. In Findings of the Association for Computational Linguistics: EACL 2023, pages 1334–1346, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal): Spelling convention sensitivity in neural language models (Nielsen et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-eacl.98.pdf
Video: https://aclanthology.org/2023.findings-eacl.98.mp4