ATP: AMRize Then Parse! Enhancing AMR Parsing with PseudoAMRs

Liang Chen, Peiyi Wang, Runxin Xu, Tianyu Liu, Zhifang Sui, Baobao Chang


Abstract
As Abstract Meaning Representation (AMR) implicitly involves compound semantic annotations, we hypothesize that auxiliary tasks which are semantically or formally related to AMR can better enhance AMR parsing. We find that 1) semantic role labeling (SRL) and dependency parsing (DP) bring larger performance gains than other tasks, e.g., machine translation (MT) and summarization, in the text-to-AMR transition, even with much less data; 2) to fit AMR better, data from auxiliary tasks should be properly “AMRized” into PseudoAMRs before training, and knowledge from shallow parsing tasks transfers better to AMR parsing after a structural transformation; 3) intermediate-task learning is a better paradigm than multitask learning for introducing auxiliary tasks to AMR parsing. Based on these empirical findings, we propose a principled method for incorporating auxiliary tasks to boost AMR parsing. Extensive experiments show that our method achieves new state-of-the-art performance on different benchmarks, especially in topology-related scores. Code and models are released at https://github.com/PKUnlp-icler/ATP.
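To make the AMRization idea concrete, below is a minimal illustrative Python sketch of how a dependency tree could be linearized into a PENMAN-style PseudoAMR string for sequence-to-sequence training. The function name, the simplified transform, and the example sentence are assumptions for illustration only; the paper and the released code at chenllliang/atp define the actual AMRization rules.

    from itertools import count

    def dep_tree_to_pseudo_amr(tokens, heads, rels):
        """Hypothetical simplification: linearize a dependency tree
        into a PENMAN-style string. tokens[i] is the i-th word,
        heads[i] is the index of its head (-1 for the root), and
        rels[i] is its dependency relation label."""
        # Build a child index from the head array; -1 marks the root.
        children = {}
        root = None
        for i, h in enumerate(heads):
            if h == -1:
                root = i
            else:
                children.setdefault(h, []).append(i)

        fresh = count(1)  # fresh variable names z1, z2, ...

        def linearize(i):
            var = f"z{next(fresh)}"
            out = f"({var} / {tokens[i]}"
            for c in children.get(i, []):
                out += f" :{rels[c]} {linearize(c)}"
            return out + ")"

        return linearize(root)

    # "The boy wants to go", with function words dropped for brevity:
    print(dep_tree_to_pseudo_amr(
        ["boy", "wants", "go"],
        [1, -1, 1],
        ["nsubj", "root", "xcomp"],
    ))
    # -> (z1 / wants :nsubj (z2 / boy) :xcomp (z3 / go))

Under the intermediate-task paradigm the abstract favors, a seq2seq parser would first be fine-tuned on such text-to-PseudoAMR pairs and only afterwards on gold AMR, rather than mixing both objectives as in multitask learning.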
Anthology ID:
2022.findings-naacl.190
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2482–2496
URL:
https://aclanthology.org/2022.findings-naacl.190
DOI:
10.18653/v1/2022.findings-naacl.190
Cite (ACL):
Liang Chen, Peiyi Wang, Runxin Xu, Tianyu Liu, Zhifang Sui, and Baobao Chang. 2022. ATP: AMRize Then Parse! Enhancing AMR Parsing with PseudoAMRs. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 2482–2496, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
ATP: AMRize Then Parse! Enhancing AMR Parsing with PseudoAMRs (Chen et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.190.pdf
Code:
chenllliang/atp, plus additional community code
Data:
Bio, LDC2017T10, LDC2020T02