Universal Discourse Representation Structure Parsing

Jiangming Liu, Shay B. Cohen, Mirella Lapata, Johan Bos


Abstract
We consider the task of crosslingual semantic parsing in the style of Discourse Representation Theory (DRT) where knowledge from annotated corpora in a resource-rich language is transferred via bitext to guide learning in other languages. We introduce 𝕌niversal Discourse Representation Theory (𝕌DRT), a variant of DRT that explicitly anchors semantic representations to tokens in the linguistic input. We develop a semantic parsing framework based on the Transformer architecture and utilize it to obtain semantic resources in multiple languages following two learning schemes. The many-to-one approach translates non-English text to English, and then runs a relatively accurate English parser on the translated text, while the one-to-many approach translates gold standard English to non-English text and trains multiple parsers (one per language) on the translations. Experimental results on the Parallel Meaning Bank show that our proposal outperforms strong baselines by a wide margin and can be used to construct (silver-standard) meaning banks for 99 languages.
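The two transfer schemes described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: `translate` and `parse_english` are hypothetical stand-ins (here trivial stubs) for a machine translation system and the English DRS parser.

```python
def translate(text, src, tgt):
    # Hypothetical stand-in for a machine translation system.
    return f"<{src}->{tgt}> {text}"

def parse_english(text):
    # Hypothetical stand-in for the relatively accurate English parser.
    return {"input": text, "drs": "..."}

def many_to_one(non_english_text, src_lang):
    """Translate non-English input to English, then run the English
    parser on the translation."""
    english = translate(non_english_text, src_lang, "en")
    return parse_english(english)

def one_to_many(english_gold, tgt_langs):
    """Translate gold-standard English sentences into each target
    language, pairing the translations with the gold meaning
    representations to train one parser per language."""
    training_sets = {}
    for lang in tgt_langs:
        training_sets[lang] = [
            (translate(text, "en", lang), drs) for text, drs in english_gold
        ]
    return training_sets
```

The one-to-many scheme yields silver-standard training data in each target language, which is how the paper constructs meaning banks beyond English.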
Anthology ID:
2021.cl-2.15
Volume:
Computational Linguistics, Volume 47, Issue 2 - June 2021
Month:
June
Year:
2021
Address:
Cambridge, MA
Venue:
CL
Publisher:
MIT Press
Note:
Pages:
445–476
URL:
https://aclanthology.org/2021.cl-2.15
DOI:
10.1162/coli_a_00406
Cite (ACL):
Jiangming Liu, Shay B. Cohen, Mirella Lapata, and Johan Bos. 2021. Universal Discourse Representation Structure Parsing. Computational Linguistics, 47(2):445–476.
Cite (Informal):
Universal Discourse Representation Structure Parsing (Liu et al., CL 2021)
PDF:
https://aclanthology.org/2021.cl-2.15.pdf
Data
Groningen Meaning Bank