NUIG: Multitasking Self-attention based approach to SigTyp 2020 Shared Task

Chinmay Choudhary


Abstract
This paper describes the multitasking self-attention based approach to the constrained sub-task within the SigTyp 2020 Shared Task. Our model is a simple neural-network architecture inspired by the Transformer (CITATION) model. The model uses multitasking to compute the values of all WALS features for a given input language simultaneously. Results show that our approach performs on par with the baseline approaches, even though it requires only phylogenetic and geographical attributes, namely Longitude, Latitude, Genus-index, Family-index and Country-index, and does not use any of the known WALS features of the respective input language to compute its missing WALS features.
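As an illustration of the idea in the abstract, the following NumPy sketch treats the five phylogenetic and geographical attributes as a short token sequence, passes it through one self-attention layer, and attaches a separate classification head per WALS feature, so all features are predicted in a single forward pass. All dimensions, feature names, and weights here are hypothetical (the weights are random and untrained); this is not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes -- not taken from the paper.
D = 16                                   # embedding size
N_GENUS, N_FAMILY, N_COUNTRY = 100, 50, 60
# feature name -> number of categories (illustrative values)
WALS_FEATURES = {"81A_word_order": 7, "83A_obj_verb_order": 3}

# Lookup tables for the categorical inputs, plus a projection for (lat, lon).
E_genus = rng.normal(size=(N_GENUS, D))
E_family = rng.normal(size=(N_FAMILY, D))
E_country = rng.normal(size=(N_COUNTRY, D))
W_geo = rng.normal(size=(2, D))

# Single self-attention layer parameters.
Wq, Wk, Wv = (rng.normal(size=(D, D)) for _ in range(3))

# One classification head per WALS feature: the "multitasking" part.
heads = {f: rng.normal(size=(D, k)) for f, k in WALS_FEATURES.items()}

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def predict(genus_idx, family_idx, country_idx, lat, lon):
    """Predict every WALS feature of one language simultaneously."""
    # Treat the attributes as a 4-token sequence for self-attention.
    tokens = np.stack([
        E_genus[genus_idx],
        E_family[family_idx],
        E_country[country_idx],
        np.array([lat / 90.0, lon / 180.0]) @ W_geo,  # normalised coordinates
    ])
    Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
    attn = softmax(Q @ K.T / np.sqrt(D))              # (4, 4) attention weights
    pooled = (attn @ V).mean(axis=0)                  # mean-pool attended tokens
    # Each head emits a distribution over that feature's categories.
    return {f: softmax(pooled @ W) for f, W in heads.items()}

preds = predict(genus_idx=3, family_idx=7, country_idx=12, lat=53.27, lon=-9.05)
```

With trained rather than random weights, `preds` would give one probability distribution per WALS feature, which is the sense in which a single multitasking model fills in all missing features at once.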
Anthology ID:
2020.sigtyp-1.6
Volume:
Proceedings of the Second Workshop on Computational Research in Linguistic Typology
Month:
November
Year:
2020
Address:
Online
Editors:
Ekaterina Vylomova, Edoardo M. Ponti, Eitan Grossman, Arya D. McCarthy, Yevgeni Berzak, Haim Dubossarsky, Ivan Vulić, Roi Reichart, Anna Korhonen, Ryan Cotterell
Venue:
SIGTYP
SIG:
SIGTYP
Publisher:
Association for Computational Linguistics
Pages:
43–50
URL:
https://aclanthology.org/2020.sigtyp-1.6
DOI:
10.18653/v1/2020.sigtyp-1.6
Cite (ACL):
Chinmay Choudhary. 2020. NUIG: Multitasking Self-attention based approach to SigTyp 2020 Shared Task. In Proceedings of the Second Workshop on Computational Research in Linguistic Typology, pages 43–50, Online. Association for Computational Linguistics.
Cite (Informal):
NUIG: Multitasking Self-attention based approach to SigTyp 2020 Shared Task (Choudhary, SIGTYP 2020)
PDF:
https://aclanthology.org/2020.sigtyp-1.6.pdf
Video:
https://slideslive.com/38939794