Cross-domain Generalization for AMR Parsing

Xuefeng Bai, Sen Yang, Leyang Cui, Linfeng Song, Yue Zhang


Abstract
Abstract Meaning Representation (AMR) parsing aims to predict an AMR graph from textual input. Recently, AMR parsing performance has grown notably. However, most existing work focuses on improving performance within a specific domain, ignoring the potential domain dependence of AMR parsing systems. To address this, we extensively evaluate five representative AMR parsers on five domains and analyze the challenges of cross-domain AMR parsing. We observe that these challenges mainly arise from the distribution shift of words and AMR concepts. Based on this observation, we investigate two approaches that reduce the domain distribution divergence of text features and AMR features, respectively. Experimental results on two out-of-domain test sets show the superiority of our method.
Anthology ID:
2022.emnlp-main.749
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
10907–10921
URL:
https://aclanthology.org/2022.emnlp-main.749
DOI:
10.18653/v1/2022.emnlp-main.749
Cite (ACL):
Xuefeng Bai, Sen Yang, Leyang Cui, Linfeng Song, and Yue Zhang. 2022. Cross-domain Generalization for AMR Parsing. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 10907–10921, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Cross-domain Generalization for AMR Parsing (Bai et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.749.pdf