%0 Conference Proceedings
%T Exploring Cross-Lingual Transfer of Morphological Knowledge In Sequence-to-Sequence Models
%A Jin, Huiming
%A Kann, Katharina
%Y Faruqui, Manaal
%Y Schuetze, Hinrich
%Y Trancoso, Isabel
%Y Yaghoobzadeh, Yadollah
%S Proceedings of the First Workshop on Subword and Character Level Models in NLP
%D 2017
%8 September
%I Association for Computational Linguistics
%C Copenhagen, Denmark
%F jin-kann-2017-exploring
%X Multi-task training is an effective method to mitigate the data sparsity problem. It has recently been applied for cross-lingual transfer learning for paradigm completion—the task of producing inflected forms of lemmata—with sequence-to-sequence networks. However, it remains unclear how the model transfers knowledge across languages, and whether and which information is shared. To investigate this, we propose a set of data-dependent experiments using an existing encoder-decoder recurrent neural network for the task. Our results show that the performance gains indeed surpass a pure regularization effect and that knowledge about language and morphology can be transferred.
%R 10.18653/v1/W17-4110
%U https://aclanthology.org/W17-4110
%U https://doi.org/10.18653/v1/W17-4110
%P 70-75