Modeling Word Forms Using Latent Underlying Morphs and Phonology

Ryan Cotterell, Nanyun Peng, Jason Eisner


Abstract
The observed pronunciations or spellings of words are often explained as arising from the “underlying forms” of their morphemes. These forms are latent strings that linguists try to reconstruct by hand. We propose to reconstruct them automatically at scale, enabling generalization to new words. Given some surface word types of a concatenative language along with the abstract morpheme sequences that they express, we show how to recover consistent underlying forms for these morphemes, together with the (stochastic) phonology that maps each concatenation of underlying forms to a surface form. Our technique involves loopy belief propagation in a natural directed graphical model whose variables are unknown strings and whose conditional distributions are encoded as finite-state machines with trainable weights. We define training and evaluation paradigms for the task of surface word prediction, and report results on subsets of 7 languages.
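To make the generative story above concrete, here is a minimal Python sketch. It is not the paper's system: the paper encodes the stochastic phonology as weighted finite-state machines and infers string-valued variables with loopy belief propagation, whereas this toy replaces both with a single invented voicing rule and brute-force enumeration over a hand-picked candidate set. All morphs, surface forms, and the rule probability p_voice are hypothetical illustrations.

import itertools
import math

# Hypothetical training data: each abstract morpheme sequence is paired
# with its observed surface form (cf. the paper's surface word types).
OBSERVED = {
    ("dog", "PL"): "dogz",   # plural surfaces as voiced [z]
    ("cat", "PL"): "cats",   # plural surfaces as voiceless [s]
}

# Hand-picked candidate underlying forms per morpheme (a stand-in for the
# paper's unbounded string-valued latent variables).
CANDIDATES = {"dog": ["dog"], "cat": ["cat"], "PL": ["s", "z"]}

VOICED = set("bdgvzmnlraeiou")  # toy segment inventory

def phonology_prob(underlying, surface, p_voice=0.9):
    """P(surface | underlying) under a one-rule stochastic phonology:
    word-final /s/ voices to [z] after a voiced segment with prob p_voice;
    otherwise the underlying form surfaces faithfully."""
    if len(underlying) > 1 and underlying.endswith("s") and underlying[-2] in VOICED:
        if surface == underlying[:-1] + "z":
            return p_voice
        return (1.0 - p_voice) if surface == underlying else 0.0
    return 1.0 if surface == underlying else 0.0

def log_likelihood(assignment):
    """Joint log-probability of all observed words given one underlying
    form per morpheme (uniform prior over candidates)."""
    total = 0.0
    for morphs, surface in OBSERVED.items():
        underlying = "".join(assignment[m] for m in morphs)  # concatenative morphology
        p = phonology_prob(underlying, surface)
        if p == 0.0:
            return float("-inf")
        total += math.log(p)
    return total

# Exact MAP inference by enumeration (the paper uses loopy belief
# propagation over finite-state machines instead).
best = max(
    (dict(zip(CANDIDATES, combo)) for combo in itertools.product(*CANDIDATES.values())),
    key=log_likelihood,
)
print(best)  # {'dog': 'dog', 'cat': 'cat', 'PL': 's'}

Even this toy reproduces the paper's central idea: positing a single underlying plural morph /s/ plus a phonological rule explains both surface alternants [s] and [z], whereas no single underlying form that surfaces faithfully could account for both.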
Anthology ID: Q15-1031
Volume: Transactions of the Association for Computational Linguistics, Volume 3
Year: 2015
Address: Cambridge, MA
Editors: Michael Collins, Lillian Lee
Venue: TACL
Publisher: MIT Press
Pages: 433–447
URL: https://aclanthology.org/Q15-1031
DOI: 10.1162/tacl_a_00149
Cite (ACL): Ryan Cotterell, Nanyun Peng, and Jason Eisner. 2015. Modeling Word Forms Using Latent Underlying Morphs and Phonology. Transactions of the Association for Computational Linguistics, 3:433–447.
Cite (Informal): Modeling Word Forms Using Latent Underlying Morphs and Phonology (Cotterell et al., TACL 2015)
PDF: https://aclanthology.org/Q15-1031.pdf
Data: CELEX