Comparing Statistical and Neural Models for Learning Sound Correspondences

Clémentine Fourrier, Benoît Sagot


Abstract
Cognate prediction and proto-form reconstruction are key tasks in computational historical linguistics that rely on the study of sound change regularity. Solving these tasks is closely analogous to machine translation, yet methods from that field have barely been applied to historical linguistics. In this paper, we therefore investigate whether two machine-translation-inspired models, one statistical and one neural, can learn sound correspondences between a proto-language and its daughter languages. We first experiment on plausible, noise-free artificial languages in order to study the role of each parameter on the algorithms' respective performance under near-ideal conditions. We then turn to real languages, namely Latin, Italian and Spanish, to see whether these results generalise. We show that both model types manage to learn sound changes despite data scarcity, although which model type performs best depends on several parameters, such as the size of the training data, the ambiguity of the data, and the prediction direction.
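Purely for illustration of what "regular sound correspondences" means here (this is not the paper's statistical or neural method): such correspondences can be sketched as ordered string-rewrite rules from a proto-language to a daughter language. The rules and function below are simplified textbook examples, e.g. Latin ct becoming Italian tt (octo → otto) and Spanish ch (octo → ocho).

```python
# Toy sketch (not the paper's models): regular sound correspondences
# expressed as ordered rewrite rules, Latin -> daughter language.
# Rules are deliberately simplified for illustration.
RULES = {
    "italian": [("ct", "tt"), ("pt", "tt")],   # octo -> otto, septem-type clusters
    "spanish": [("ct", "ch")],                 # octo -> ocho
}

def apply_correspondences(word: str, language: str) -> str:
    """Apply each rewrite rule to the word, in order."""
    for old, new in RULES[language]:
        word = word.replace(old, new)
    return word

print(apply_correspondences("octo", "italian"))  # otto
print(apply_correspondences("octo", "spanish"))  # ocho
```

The learning task studied in the paper is essentially the inverse of writing such rules by hand: inferring correspondences of this kind automatically from cognate pairs, in either prediction direction.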
Anthology ID:
2020.lt4hala-1.12
Volume:
Proceedings of LT4HALA 2020 - 1st Workshop on Language Technologies for Historical and Ancient Languages
Month:
May
Year:
2020
Address:
Marseille, France
Editors:
Rachele Sprugnoli, Marco Passarotti
Venue:
LT4HALA
Publisher:
European Language Resources Association (ELRA)
Pages:
79–83
Language:
English
URL:
https://aclanthology.org/2020.lt4hala-1.12
Cite (ACL):
Clémentine Fourrier and Benoît Sagot. 2020. Comparing Statistical and Neural Models for Learning Sound Correspondences. In Proceedings of LT4HALA 2020 - 1st Workshop on Language Technologies for Historical and Ancient Languages, pages 79–83, Marseille, France. European Language Resources Association (ELRA).
Cite (Informal):
Comparing Statistical and Neural Models for Learning Sound Correspondences (Fourrier & Sagot, LT4HALA 2020)
PDF:
https://aclanthology.org/2020.lt4hala-1.12.pdf
Code
clefourrier/PLexGen
Data
EtymDB 2.0