Sequence-to-sequence AMR Parsing with Ancestor Information

Chen Yu, Daniel Gildea


Abstract
AMR parsing is the task of automatically mapping a sentence to an Abstract Meaning Representation (AMR) semantic graph. The difficulty comes from generating the complex graph structure. The previous state-of-the-art method linearizes the AMR graph into a sequence and directly fine-tunes a pretrained sequence-to-sequence Transformer model (BART). However, treating the graph purely as a sequence does not take advantage of its structural information. In this paper, we design several strategies to add important ancestor information into the Transformer decoder. Our experiments show that we improve performance on both the AMR 2.0 and AMR 3.0 datasets and achieve new state-of-the-art results.
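
To make the core idea concrete, the sketch below shows one plausible way ancestor information could enter a Transformer decoder; it is a hypothetical illustration, not the authors' implementation. For each position in the linearized AMR graph, it mean-pools the hidden states of that node's ancestors (supplied as an ancestor_mask) and fuses the pooled vector into the token representation before causal self-attention. The class name, the mask format, and the mean-pooling fusion are all assumptions made for this sketch.

import torch
import torch.nn as nn

class AncestorAugmentedDecoderLayer(nn.Module):
    # Hypothetical sketch: fuses pooled ancestor states into decoder inputs.
    def __init__(self, d_model=768, n_heads=8):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ancestor_proj = nn.Linear(d_model, d_model)  # mixes ancestor signal back in
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x, ancestor_mask):
        # x: (batch, seq, d_model) decoder states for the linearized graph.
        # ancestor_mask: (batch, seq, seq); entry (i, j) = 1 when the node at
        # position j is an ancestor of the node generated at position i.
        counts = ancestor_mask.sum(dim=-1, keepdim=True).clamp(min=1.0)
        pooled = ancestor_mask @ x / counts               # mean over ancestor states
        x = self.norm1(x + self.ancestor_proj(pooled))    # fuse ancestor signal
        seq = x.size(1)
        causal = torch.triu(torch.ones(seq, seq, dtype=torch.bool, device=x.device), diagonal=1)
        attn_out, _ = self.self_attn(x, x, x, attn_mask=causal)
        return self.norm2(x + attn_out)

# Toy usage: as a stand-in, treat every earlier token as an ancestor.
layer = AncestorAugmentedDecoderLayer()
x = torch.randn(2, 5, 768)
anc = torch.tril(torch.ones(2, 5, 5), diagonal=-1)
out = layer(x, anc)  # (2, 5, 768)

The paper itself explores several strategies for injecting ancestor information; this pooling-and-fusion variant is only one possible instantiation, shown to indicate where such a structural signal can enter the decoder.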
Anthology ID:
2022.acl-short.63
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
571–577
URL:
https://aclanthology.org/2022.acl-short.63
DOI:
10.18653/v1/2022.acl-short.63
Cite (ACL):
Chen Yu and Daniel Gildea. 2022. Sequence-to-sequence AMR Parsing with Ancestor Information. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 571–577, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Sequence-to-sequence AMR Parsing with Ancestor Information (Yu & Gildea, ACL 2022)
PDF:
https://aclanthology.org/2022.acl-short.63.pdf