%0 Journal Article
%T Universal Discourse Representation Structure Parsing
%A Liu, Jiangming
%A Cohen, Shay B.
%A Lapata, Mirella
%A Bos, Johan
%J Computational Linguistics
%D 2021
%8 June
%V 47
%N 2
%I MIT Press
%C Cambridge, MA
%F liu-etal-2021-universal
%X We consider the task of crosslingual semantic parsing in the style of Discourse Representation Theory (DRT) where knowledge from annotated corpora in a resource-rich language is transferred via bitext to guide learning in other languages. We introduce 𝕌niversal Discourse Representation Theory (𝕌DRT), a variant of DRT that explicitly anchors semantic representations to tokens in the linguistic input. We develop a semantic parsing framework based on the Transformer architecture and utilize it to obtain semantic resources in multiple languages following two learning schemes. The many-to-one approach translates non-English text to English, and then runs a relatively accurate English parser on the translated text, while the one-to-many approach translates gold standard English to non-English text and trains multiple parsers (one per language) on the translations. Experimental results on the Parallel Meaning Bank show that our proposal outperforms strong baselines by a wide margin and can be used to construct (silver-standard) meaning banks for 99 languages.
%R 10.1162/coli_a_00406
%U https://aclanthology.org/2021.cl-2.15
%U https://doi.org/10.1162/coli_a_00406
%P 445-476