Rachel Sidney Devianti


2024

Transferability of Syntax-Aware Graph Neural Networks in Zero-Shot Cross-Lingual Semantic Role Labeling
Rachel Sidney Devianti | Yusuke Miyao
Findings of the Association for Computational Linguistics: EMNLP 2024

Recent models for cross-lingual semantic role labeling (SRL) rarely analyze the applicability of their network selection. We believe that network selection is important because it affects the transferability of cross-lingual models, i.e., how well a model extracts universal features from source languages to label target languages. Therefore, we comprehensively compare the transferability of different graph neural network (GNN)-based models enriched with universal dependency trees. The GNN-based models include transformer-based, graph convolutional network-based, and graph attention network (GAT)-based models. We focus on a zero-shot setting, training the models on English and evaluating them on 23 target languages provided by the Universal Proposition Bank. Our experiments consistently show that syntax from universal dependency trees is essential for cross-lingual SRL models to achieve better transferability. Dependency-aware self-attention with relative position representations (SAN-RPRs) transfers best across languages, especially at long dependency distances. We also show that dependency-aware two-attention relational GATs transfer better than SAN-RPRs in languages where most arguments lie within a dependency distance of 1-2.
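For readers unfamiliar with SAN-RPRs, the sketch below illustrates the general idea of self-attention with relative position representations (Shaw et al., 2018), here keyed by pairwise dependency-tree distances rather than linear token offsets. This is a minimal illustration under assumptions, not the paper's implementation: the module name `DepDistanceSAN`, the single-head layout, and the `max_dist` clipping parameter are all hypothetical.

```python
# Minimal single-head sketch of dependency-distance-aware self-attention with
# relative position representations. Names and hyperparameters are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DepDistanceSAN(nn.Module):
    def __init__(self, d_model: int, max_dist: int = 4):
        super().__init__()
        self.d_model = d_model
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # One learned key/value embedding per clipped dependency distance 0..max_dist.
        self.rel_k = nn.Embedding(max_dist + 1, d_model)
        self.rel_v = nn.Embedding(max_dist + 1, d_model)
        self.max_dist = max_dist

    def forward(self, x: torch.Tensor, dep_dist: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model); dep_dist: (batch, seq, seq) integer tree distances.
        q, k, v = self.q(x), self.k(x), self.v(x)
        dist = dep_dist.clamp(max=self.max_dist)
        rk = self.rel_k(dist)   # (batch, seq, seq, d_model)
        rv = self.rel_v(dist)
        # Content-content plus content-relation attention logits.
        scores = torch.matmul(q, k.transpose(-2, -1))
        scores = scores + torch.einsum("bid,bijd->bij", q, rk)
        attn = F.softmax(scores / self.d_model ** 0.5, dim=-1)
        # Weighted sum of token values plus relation values.
        return torch.matmul(attn, v) + torch.einsum("bij,bijd->bid", attn, rv)
```

In this sketch the relation embeddings replace the linear-offset embeddings of the original formulation, so attention can condition directly on how far apart two tokens are in the dependency tree, which is the kind of syntactic signal the abstract argues is essential for transfer.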