Stacked AMR Parsing with Silver Data

Qingrong Xia, Zhenghua Li, Rui Wang, Min Zhang


Abstract
The lack of sufficient human-annotated data is a major challenge for abstract meaning representation (AMR) parsing. To alleviate this problem, previous works usually make use of silver data or pre-trained language models. In particular, one recent seq-to-seq work directly fine-tunes a pre-trained encoder-decoder language model on AMR graph sequences and achieves new state-of-the-art results, outperforming previous works by a large margin. However, this comes at the cost of relatively slow decoding. In this work, we investigate alternative approaches that achieve competitive performance at faster speeds. We propose a simplified AMR parser and a pre-training technique for the effective use of silver data. Extensive experiments on the widely used AMR 2.0 dataset demonstrate that our Transformer-based AMR parser achieves the best performance among seq2graph-based models. Furthermore, with silver data, our model achieves results competitive with the state-of-the-art model while being an order of magnitude faster. Detailed analyses provide further insights into our proposed model and the effectiveness of the pre-training technique.
Anthology ID:
2021.findings-emnlp.406
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
4729–4738
URL:
https://aclanthology.org/2021.findings-emnlp.406
DOI:
10.18653/v1/2021.findings-emnlp.406
Cite (ACL):
Qingrong Xia, Zhenghua Li, Rui Wang, and Min Zhang. 2021. Stacked AMR Parsing with Silver Data. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 4729–4738, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Stacked AMR Parsing with Silver Data (Xia et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.406.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.406.mp4
Code:
kirosummer/amr