On the Benefit of Syntactic Supervision for Cross-lingual Transfer in Semantic Role Labeling

Zhisong Zhang, Emma Strubell, Eduard Hovy


Abstract
Although recent developments in neural architectures and pre-trained representations have greatly increased state-of-the-art model performance on fully-supervised semantic role labeling (SRL), the task remains challenging for languages where supervised SRL training data are not abundant. Cross-lingual learning can improve performance in this setting by transferring knowledge from high-resource languages to low-resource ones. Moreover, we hypothesize that annotations of syntactic dependencies can be leveraged to further facilitate cross-lingual transfer. In this work, we perform an empirical exploration of the helpfulness of syntactic supervision for cross-lingual SRL within a simple multitask learning scheme. With comprehensive evaluations across ten languages (in addition to English) and three SRL benchmark datasets, including both dependency- and span-based SRL, we show the effectiveness of syntactic supervision in low-resource scenarios.
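The abstract describes adding syntactic supervision via a simple multitask learning scheme: a shared encoder feeds both an SRL head and a dependency head, and their losses are combined. The following is a minimal NumPy sketch of that general idea, not the paper's actual model; all dimensions, parameter names, and the interpolation weight `lam` are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (illustrative only): input size, hidden size,
# number of SRL role labels, number of dependency labels.
D_IN, D_HID, N_SRL, N_DEP = 8, 16, 5, 4

# Shared encoder parameters plus two task-specific heads, so syntactic
# (dependency) supervision shapes the representation used for SRL.
W_enc = rng.normal(size=(D_IN, D_HID))
W_srl = rng.normal(size=(D_HID, N_SRL))  # SRL classification head
W_dep = rng.normal(size=(D_HID, N_DEP))  # dependency-label head

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(logits, gold):
    # Mean negative log-likelihood of the gold labels.
    probs = softmax(logits)
    return -np.mean(np.log(probs[np.arange(len(gold)), gold]))

def multitask_loss(x, srl_gold, dep_gold, lam=0.5):
    """Encode once, then take a weighted sum of the two task losses."""
    h = np.tanh(x @ W_enc)  # shared representation for both tasks
    loss_srl = cross_entropy(h @ W_srl, srl_gold)
    loss_dep = cross_entropy(h @ W_dep, dep_gold)
    return loss_srl + lam * loss_dep

# Toy batch of 6 token representations with random gold labels.
x = rng.normal(size=(6, D_IN))
srl_gold = rng.integers(0, N_SRL, size=6)
dep_gold = rng.integers(0, N_DEP, size=6)
loss = multitask_loss(x, srl_gold, dep_gold)
```

In a cross-lingual setting, the appeal of this setup is that dependency annotations (e.g. Universal Dependencies) exist for many languages that lack SRL data, so the auxiliary loss can supply supervision in the target language even when `srl_gold` is only available for the source language.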
Anthology ID:
2021.emnlp-main.503
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
6229–6246
URL:
https://aclanthology.org/2021.emnlp-main.503
DOI:
10.18653/v1/2021.emnlp-main.503
Cite (ACL):
Zhisong Zhang, Emma Strubell, and Eduard Hovy. 2021. On the Benefit of Syntactic Supervision for Cross-lingual Transfer in Semantic Role Labeling. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 6229–6246, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
On the Benefit of Syntactic Supervision for Cross-lingual Transfer in Semantic Role Labeling (Zhang et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.503.pdf
Video:
https://aclanthology.org/2021.emnlp-main.503.mp4
Code:
zzsfornlp/zmsp
Data:
OntoNotes 5.0, Universal Dependencies