Online Back-Parsing for AMR-to-Text Generation

Xuefeng Bai, Linfeng Song, Yue Zhang


Abstract
AMR-to-text generation aims to recover a text containing the same meaning as an input AMR graph. Current research develops increasingly powerful graph encoders to better represent AMR graphs, while decoders based on standard language modeling are used to generate outputs. We propose a decoder that back-predicts projected AMR graphs on the target sentence during text generation. As a result, our outputs can better preserve the input meaning than those of standard decoders. Experiments on two AMR benchmarks show the superiority of our model over the previous state-of-the-art system based on graph Transformer.
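The core idea in the abstract, training the decoder to generate text while also back-predicting the AMR graph it came from, can be sketched as a joint objective. This is a hypothetical simplification, not the paper's implementation: `joint_loss`, its arguments, and the weight `alpha` are illustrative names, and the real model predicts graph structure per decoding step rather than from pre-aligned labels.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def joint_loss(gen_logits, gold_tokens, parse_logits, gold_nodes, alpha=0.5):
    """Hypothetical joint objective: standard token-level cross-entropy
    for text generation, plus a back-parsing cross-entropy that asks each
    decoder step to also recover the AMR node aligned to its output token.
    `alpha` (illustrative) trades off the two terms."""
    steps = np.arange(len(gold_tokens))
    gen_p = softmax(gen_logits)      # (steps, vocab)
    parse_p = softmax(parse_logits)  # (steps, num_amr_nodes)
    gen_loss = -np.mean(np.log(gen_p[steps, gold_tokens]))
    parse_loss = -np.mean(np.log(parse_p[steps, gold_nodes]))
    return gen_loss + alpha * parse_loss
```

Under this sketch, a decoder that generates fluent text but ignores the source graph is still penalized through the back-parsing term, which is the mechanism the abstract credits for better meaning preservation.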
Anthology ID:
2020.emnlp-main.92
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1206–1219
URL:
https://aclanthology.org/2020.emnlp-main.92
DOI:
10.18653/v1/2020.emnlp-main.92
Bibkey:
Cite (ACL):
Xuefeng Bai, Linfeng Song, and Yue Zhang. 2020. Online Back-Parsing for AMR-to-Text Generation. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 1206–1219, Online. Association for Computational Linguistics.
Cite (Informal):
Online Back-Parsing for AMR-to-Text Generation (Bai et al., EMNLP 2020)
PDF:
https://aclanthology.org/2020.emnlp-main.92.pdf
Video:
https://slideslive.com/38939115
Code:
muyeby/AMR-Backparsing
Data:
LDC2017T10