Cross-lingual Transfer Learning and Multitask Learning for Capturing Multiword Expressions

Shiva Taslimipoor, Omid Rohanian, Le An Ha


Abstract
Recent developments in deep learning have prompted a surge of interest in the application of multitask and transfer learning to NLP problems. In this study, we explore, for the first time, the application of transfer learning (TRL) and multitask learning (MTL) to the identification of Multiword Expressions (MWEs). For MTL, we exploit the shared syntactic information between MWE and dependency parsing models to jointly train a single model on both tasks. We specifically predict two types of labels: MWE and dependency parse. Our neural MTL architecture utilises the supervision of dependency parsing in lower layers and predicts MWE tags in upper layers. In the TRL scenario, we overcome the scarcity of data by learning a model on a larger MWE dataset and transferring the knowledge to a resource-poor setting in another language. In both scenarios, the resulting models achieved higher performance compared to standard neural approaches.
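The abstract's multitask setup — a shared encoder whose lower layers receive dependency-parsing supervision while upper layers predict MWE tags — can be sketched as a single forward pass. This is a minimal illustrative sketch only: the layer sizes, the use of simple dense layers in place of recurrent encoders, and all variable names are assumptions for exposition, not the authors' actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

SEQ_LEN, EMB_DIM, HID_DIM = 10, 50, 32
N_DEP_LABELS, N_MWE_TAGS = 40, 3  # e.g. IOB-style MWE tags (assumed sizes)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Token representations for one sentence (stand-ins for word embeddings)
tokens = rng.normal(size=(SEQ_LEN, EMB_DIM))

# Shared lower layer (stands in for a recurrent/contextual encoder)
W_lower = rng.normal(size=(EMB_DIM, HID_DIM))
lower = np.tanh(tokens @ W_lower)

# Dependency-label head attached to the LOWER layer:
# its loss would supervise the shared representation during joint training
W_dep = rng.normal(size=(HID_DIM, N_DEP_LABELS))
dep_probs = softmax(lower @ W_dep)

# Upper layer builds on the syntactically-supervised representation
W_upper = rng.normal(size=(HID_DIM, HID_DIM))
upper = np.tanh(lower @ W_upper)

# MWE tagging head on top of the upper layer
W_mwe = rng.normal(size=(HID_DIM, N_MWE_TAGS))
mwe_probs = softmax(upper @ W_mwe)

print(dep_probs.shape, mwe_probs.shape)  # per-token label distributions
```

In joint training, the two heads' losses would be summed (possibly weighted) and backpropagated through the shared lower layer, which is how the dependency signal shapes the representation the MWE tagger sees.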
Anthology ID:
W19-5119
Volume:
Proceedings of the Joint Workshop on Multiword Expressions and WordNet (MWE-WN 2019)
Month:
August
Year:
2019
Address:
Florence, Italy
Venue:
MWE
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
155–161
URL:
https://aclanthology.org/W19-5119
DOI:
10.18653/v1/W19-5119
Cite (ACL):
Shiva Taslimipoor, Omid Rohanian, and Le An Ha. 2019. Cross-lingual Transfer Learning and Multitask Learning for Capturing Multiword Expressions. In Proceedings of the Joint Workshop on Multiword Expressions and WordNet (MWE-WN 2019), pages 155–161, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Cross-lingual Transfer Learning and Multitask Learning for Capturing Multiword Expressions (Taslimipoor et al., MWE 2019)
PDF:
https://aclanthology.org/W19-5119.pdf