Alignment-free Cross-lingual Semantic Role Labeling

Rui Cai, Mirella Lapata


Abstract
Cross-lingual semantic role labeling (SRL) aims at leveraging resources in a source language to minimize the effort required to construct annotations or models for a new target language. Recent approaches rely on word alignments, machine translation engines, or preprocessing tools such as parsers or taggers. We propose a cross-lingual SRL model which only requires annotations in a source language and access to raw text in the form of a parallel corpus. The backbone of our model is an LSTM-based semantic role labeler jointly trained with a semantic role compressor and multilingual word embeddings. The compressor collects useful information from the output of the semantic role labeler, filtering noisy and conflicting evidence. It lives in a multilingual embedding space and provides direct supervision for predicting semantic roles in the target language. Results on the Universal Proposition Bank and manually annotated datasets show that our method is highly effective, even against systems utilizing supervised features.
Anthology ID:
2020.emnlp-main.319
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3883–3894
URL:
https://aclanthology.org/2020.emnlp-main.319
DOI:
10.18653/v1/2020.emnlp-main.319
Cite (ACL):
Rui Cai and Mirella Lapata. 2020. Alignment-free Cross-lingual Semantic Role Labeling. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 3883–3894, Online. Association for Computational Linguistics.
Cite (Informal):
Alignment-free Cross-lingual Semantic Role Labeling (Cai & Lapata, EMNLP 2020)
PDF:
https://aclanthology.org/2020.emnlp-main.319.pdf
Video:
https://slideslive.com/38938837