Translate, then Parse! A Strong Baseline for Cross-Lingual AMR Parsing

Sarah Uhrig, Yoalli Garcia, Juri Opitz, Anette Frank
Abstract
In cross-lingual Abstract Meaning Representation (AMR) parsing, researchers develop models that project sentences from various languages onto their AMRs to capture their essential semantic structures: given a sentence in any language, we aim to capture its core semantic content through concepts connected by manifold types of semantic relations. Methods typically leverage large silver training data to learn a single model that is able to project non-English sentences to AMRs. However, we find that a simple baseline tends to be overlooked: translating the sentences to English and projecting their AMR with a monolingual AMR parser (translate+parse, T+P). In this paper, we revisit this simple two-step baseline, and enhance it with a strong NMT system and a strong AMR parser. Our experiments show that T+P outperforms a recent state-of-the-art system across all tested languages: German, Italian, Spanish and Mandarin, with +14.6, +12.6, +14.3 and +16.0 Smatch points, respectively.
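The two-step pipeline from the abstract can be sketched as a simple function composition. This is a minimal illustration, not the authors' implementation: the NMT system and AMR parser are injected as callables, and the toy stand-ins below (including the example sentence and its AMR in PENMAN notation) are hypothetical placeholders for the strong off-the-shelf systems the paper actually plugs in.

```python
from typing import Callable

def translate_then_parse(
    sentence: str,
    translate: Callable[[str], str],  # any NMT system targeting English
    parse_amr: Callable[[str], str],  # any monolingual English AMR parser
) -> str:
    """T+P baseline: translate the sentence to English, then parse it."""
    english = translate(sentence)
    return parse_amr(english)

# Toy stand-ins so the sketch runs end to end; real systems replace these.
def toy_translate(sentence: str) -> str:
    # Hypothetical lookup in place of a real German-to-English NMT model.
    table = {"Der Junge will gehen.": "The boy wants to go."}
    return table.get(sentence, sentence)

def toy_parse(sentence: str) -> str:
    # A real parser would produce the AMR graph for its input; this returns
    # a fixed PENMAN-notation graph for the single toy example above.
    return "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))"

amr = translate_then_parse("Der Junge will gehen.", toy_translate, toy_parse)
```

The design point is that T+P needs no cross-lingual training at all: any improvement to either component (better MT, better English AMR parsing) transfers directly to the cross-lingual setting.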
Anthology ID:
2021.iwpt-1.6
Volume:
Proceedings of the 17th International Conference on Parsing Technologies and the IWPT 2021 Shared Task on Parsing into Enhanced Universal Dependencies (IWPT 2021)
Month:
August
Year:
2021
Address:
Online
Venues:
ACL | IJCNLP | IWPT
SIG:
SIGPARSE
Publisher:
Association for Computational Linguistics
Pages:
58–64
URL:
https://aclanthology.org/2021.iwpt-1.6
DOI:
10.18653/v1/2021.iwpt-1.6
Cite (ACL):
Sarah Uhrig, Yoalli Garcia, Juri Opitz, and Anette Frank. 2021. Translate, then Parse! A Strong Baseline for Cross-Lingual AMR Parsing. In Proceedings of the 17th International Conference on Parsing Technologies and the IWPT 2021 Shared Task on Parsing into Enhanced Universal Dependencies (IWPT 2021), pages 58–64, Online. Association for Computational Linguistics.
Cite (Informal):
Translate, then Parse! A Strong Baseline for Cross-Lingual AMR Parsing (Uhrig et al., IWPT 2021)
PDF:
https://aclanthology.org/2021.iwpt-1.6.pdf
Code
 Heidelberg-NLP/simple-xamr