One-Size-Fits-All Multilingual Models

Ben Peters, André F. T. Martins


Abstract
This paper presents DeepSPIN’s submissions to Tasks 0 and 1 of the SIGMORPHON 2020 Shared Task. For both tasks, we present multilingual models trained jointly on data from all languages. We perform no language-specific hyperparameter tuning: each of our submissions uses the same model for all languages. Our basic architecture is the sparse sequence-to-sequence model with entmax attention and loss, which allows our models to learn sparse, local alignments while remaining trainable with gradient-based techniques. For Task 1, we achieve strong performance with both RNN- and transformer-based sparse models. For Task 0, we extend our RNN-based model to a multi-encoder setup in which separate modules encode the lemma and inflection sequences. Despite our models’ lack of language-specific tuning, they tie for first in Task 0 and place third in Task 1.
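
To make the sparsity mechanism concrete, below is a minimal, illustrative PyTorch sketch of a sparse normalizing transformation. It implements sparsemax, the alpha=2 member of the entmax family, not the 1.5-entmax mapping used in the paper’s actual models (which rely on the authors’ entmax library); the function name and the toy scores are hypothetical. The point it demonstrates is how such a mapping assigns exactly zero probability to low-scoring positions, which is what enables sparse, local attention alignments.

    import torch

    def sparsemax(scores: torch.Tensor, dim: int = -1) -> torch.Tensor:
        # Sparsemax (the alpha=2 case of entmax): Euclidean projection of the
        # scores onto the probability simplex. Unlike softmax, it can return
        # exact zeros, giving sparse attention weights.
        z_sorted, _ = torch.sort(scores, dim=dim, descending=True)
        k = torch.arange(1, scores.size(dim) + 1,
                         device=scores.device, dtype=scores.dtype)
        view = [1] * scores.dim()
        view[dim] = -1
        k = k.view(view)
        z_cumsum = z_sorted.cumsum(dim)
        # Support size: the largest k with 1 + k * z_(k) > sum of the top-k scores.
        support_size = ((1 + k * z_sorted) > z_cumsum).sum(dim=dim, keepdim=True)
        # Threshold tau chosen so the kept probabilities sum to one.
        tau = (z_cumsum.gather(dim, support_size - 1) - 1) / support_size.to(scores.dtype)
        return torch.clamp(scores - tau, min=0.0)

    # Toy example: the lowest-scoring position receives exactly zero weight.
    attn = sparsemax(torch.tensor([1.0, 0.5, -1.0]))  # -> tensor([0.7500, 0.2500, 0.0000])

In the paper’s models, a mapping of this kind replaces softmax both in the attention mechanism and in the output layer (via the corresponding entmax loss), so the decoder attends to only a few source characters at each step.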
Anthology ID: 2020.sigmorphon-1.4
Volume: Proceedings of the 17th SIGMORPHON Workshop on Computational Research in Phonetics, Phonology, and Morphology
Month: July
Year: 2020
Address: Online
Editors: Garrett Nicolai, Kyle Gorman, Ryan Cotterell
Venue: SIGMORPHON
SIG: SIGMORPHON
Publisher: Association for Computational Linguistics
Pages: 63–69
URL: https://aclanthology.org/2020.sigmorphon-1.4
DOI: 10.18653/v1/2020.sigmorphon-1.4
Cite (ACL): Ben Peters and André F. T. Martins. 2020. One-Size-Fits-All Multilingual Models. In Proceedings of the 17th SIGMORPHON Workshop on Computational Research in Phonetics, Phonology, and Morphology, pages 63–69, Online. Association for Computational Linguistics.
Cite (Informal): One-Size-Fits-All Multilingual Models (Peters & Martins, SIGMORPHON 2020)
PDF: https://aclanthology.org/2020.sigmorphon-1.4.pdf