Do transformer models do phonology like a linguist?

Saliha Muradoglu, Mans Hulden


Abstract
Neural sequence-to-sequence models have been very successful at tasks in phonology and morphology that seemingly require a capacity for intricate linguistic generalisations. In this paper, we perform a detailed breakdown of the power of such models to capture various phonological generalisations and to benefit from exposure to one phonological rule to infer the behaviour of another similar rule. We present two types of experiments, one of which establishes the efficacy of the transformer model on 29 different processes. The second experiment type follows a priming and held-out case split, where our model is exposed to two (or more) phenomena: one is used as a primer to make the model aware of a linguistic category (e.g. voiceless stops), and a second contains a rule with a withheld case that the model is expected to infer (e.g. word-final devoicing with a missing training example such as b→p). Results show that the transformer model can successfully model all 29 phonological phenomena considered, regardless of perceived process difficulty. We also show that the model can generalise linguistic categories and structures, such as vowels and syllables, through priming processes.
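To make the priming and held-out case split concrete, here is a minimal sketch of how such a data split could be constructed. The word forms, the primer rule (aspiration), and the helper names are illustrative assumptions, not the authors' actual dataset or pipeline; only the idea of withholding every b→p instance of word-final devoicing from training comes from the abstract.

```python
# Illustrative sketch (hypothetical data, not the paper's pipeline):
# build a priming + held-out split for word-final devoicing,
# withholding every b -> p case from training.

DEVOICING = {"b": "p", "d": "t", "g": "k"}  # word-final devoicing pairs

def apply_final_devoicing(word: str) -> str:
    """Devoice a word-final voiced stop, e.g. 'tab' -> 'tap'."""
    if word and word[-1] in DEVOICING:
        return word[:-1] + DEVOICING[word[-1]]
    return word

# Primer examples expose the model to a linguistic category
# (here, hypothetically, voiceless stops via aspiration marking).
primer = [("pata", "pʰata"), ("kali", "kʰali"), ("tunu", "tʰunu")]

# Target-rule examples: word-final devoicing, with b-final forms withheld.
words = ["tad", "mag", "tab", "lud", "rig", "nab"]
train = [(w, apply_final_devoicing(w)) for w in words if not w.endswith("b")]
heldout = [(w, apply_final_devoicing(w)) for w in words if w.endswith("b")]

print("train:", primer + train)   # model sees d->t, g->k, plus the primer
print("held out:", heldout)       # model must infer b->p at test time
```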
Anthology ID:
2023.findings-acl.541
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
8529–8537
URL:
https://aclanthology.org/2023.findings-acl.541
DOI:
10.18653/v1/2023.findings-acl.541
Cite (ACL):
Saliha Muradoglu and Mans Hulden. 2023. Do transformer models do phonology like a linguist?. In Findings of the Association for Computational Linguistics: ACL 2023, pages 8529–8537, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Do transformer models do phonology like a linguist? (Muradoglu & Hulden, Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.541.pdf