AMR Parsing with Action-Pointer Transformer

Jiawei Zhou, Tahira Naseem, Ramón Fernandez Astudillo, Radu Florian


Abstract
Abstract Meaning Representation parsing is a sentence-to-graph prediction task where target nodes are not explicitly aligned to sentence tokens. However, since graph nodes are semantically based on one or more sentence tokens, implicit alignments can be derived. Transition-based parsers operate over the sentence from left to right, capturing this inductive bias via alignments at the cost of limited expressiveness. In this work, we propose a transition-based system that combines hard-attention over sentences with a target-side action pointer mechanism to decouple source tokens from node representations and address alignments. We model the transitions as well as the pointer mechanism through straightforward modifications within a single Transformer architecture. Parser state and graph structure information are efficiently encoded using attention heads. We show that our action-pointer approach leads to increased expressiveness and attains large gains (+1.6 points) against the best transition-based AMR parser in very similar conditions. While using no graph re-categorization, our single model yields the second best Smatch score on AMR 2.0 (81.8), which is further improved to 83.4 with silver data and ensemble decoding.
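To make the target-side action-pointer idea concrete, below is a minimal, hypothetical sketch of a single pointer head that scores previously generated node positions in the decoder's own action history. The class name, projections, and masking scheme are illustrative assumptions for exposition only, not the authors' exact architecture.

```python
import torch
import torch.nn as nn

class ActionPointerHead(nn.Module):
    """Sketch: at an edge-producing action step, point back to the decoder
    position of a previously generated node (target-side pointer)."""
    def __init__(self, d_model: int):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)  # query from the current action step
        self.k_proj = nn.Linear(d_model, d_model)  # keys from past action steps

    def forward(self, dec_states, step, node_mask):
        # dec_states: (T, d) decoder hidden states over the action history
        # step:       index of the current (edge-creating) action
        # node_mask:  (T,) bool, True at positions whose action created a node
        q = self.q_proj(dec_states[step])                 # (d,)
        k = self.k_proj(dec_states)                       # (T, d)
        scores = k @ q / dec_states.size(-1) ** 0.5       # scaled dot-product scores
        valid = node_mask.clone()
        valid[step:] = False                              # only earlier positions are targets
        scores = scores.masked_fill(~valid, float("-inf"))
        # Edge actions only occur after at least one node exists, so the mask is non-empty.
        return torch.softmax(scores, dim=-1)              # distribution over node positions

# Toy usage: 6 past actions, with nodes created at steps 1 and 3.
head = ActionPointerHead(d_model=16)
states = torch.randn(6, 16)
node_mask = torch.tensor([False, True, False, True, False, False])
probs = head(states, step=5, node_mask=node_mask)
target = probs.argmax().item()  # index of the pointed-to node position
```

Because the pointer attends over decoder positions rather than source tokens, node representations are decoupled from sentence tokens, which is the decoupling the abstract describes.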
Anthology ID:
2021.naacl-main.443
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
5585–5598
URL:
https://aclanthology.org/2021.naacl-main.443
DOI:
10.18653/v1/2021.naacl-main.443
Bibkey:
Cite (ACL):
Jiawei Zhou, Tahira Naseem, Ramón Fernandez Astudillo, and Radu Florian. 2021. AMR Parsing with Action-Pointer Transformer. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 5585–5598, Online. Association for Computational Linguistics.
Cite (Informal):
AMR Parsing with Action-Pointer Transformer (Zhou et al., NAACL 2021)
PDF:
https://aclanthology.org/2021.naacl-main.443.pdf
Video:
https://aclanthology.org/2021.naacl-main.443.mp4
Data
LDC2017T10, LDC2020T02