Li Si


2024

Linguistic Guidance for Sequence-to-Sequence AMR Parsing
Tang Binghao | Lin Boda | Li Si
Proceedings of the 23rd Chinese National Conference on Computational Linguistics (Volume 1: Main Conference)

“Abstract Meaning Representation (AMR) parsing aims at capturing the meaning of a sentence in the form of an AMR graph. Sequence-to-sequence (seq2seq) methods, utilizing powerful Encoder-Decoder pre-trained language models (PLMs), have shown promising performance, and subsequent works have further improved the utilization of AMR graph information for seq2seq models. However, seq2seq models generate the output sequence incrementally, so an inaccurate subsequence at the beginning can negatively impact the final output; in addition, the interconnection between other linguistic representation formats and AMR remains underexplored in existing research. To mitigate this error propagation and to investigate the guiding influence of other representation formats on PLMs, we propose a novel approach of Linguistic Guidance for Seq2seq AMR parsing (LGSA). LGSA incorporates very limited information from various linguistic representation formats as guidance on the Encoder side, which effectively pushes PLMs closer to their full potential and boosts AMR parsing. Results on the widely used benchmarks AMR2.0 and AMR3.0 demonstrate the efficacy of LGSA, which improves seq2seq AMR parsers without silver AMR data or alignment information. Moreover, we evaluate the generalization of LGSA on out-of-domain datasets, and the results indicate that LGSA remains effective even in these challenging scenarios.”
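As background for the abstract above: seq2seq AMR parsers are typically trained to emit a linearized string form of the AMR graph (commonly PENMAN notation), which the decoder generates token by token. The sketch below is only an illustration of that generic linearization step, with a made-up graph encoding; it is not the paper's LGSA method.

```python
# Illustrative sketch: linearize an AMR graph into a PENMAN-style string,
# the usual target format when casting AMR parsing as seq2seq generation.
# The (variable, concept, edges) tuple encoding here is an assumption for
# the example, not an API from the paper.

def linearize(var, concept, edges=()):
    """Render an AMR node as a PENMAN string.

    var     -- variable name, e.g. "w"
    concept -- concept label, e.g. "want-01"
    edges   -- sequence of (role, child) pairs; a child is either a
               re-entrant variable (str) or a nested node (tuple).
    """
    out = f"({var} / {concept}"
    for role, child in edges:
        if isinstance(child, tuple):
            out += f" :{role} {linearize(*child)}"  # nested subgraph
        else:
            out += f" :{role} {child}"              # re-entrancy to a variable
    return out + ")"

# "The boy wants to go": note the re-entrant ARG0 of go-02 pointing back to b.
amr = ("w", "want-01", [
    ("ARG0", ("b", "boy", ())),
    ("ARG1", ("g", "go-02", [("ARG0", "b")])),
])
print(linearize(*amr))
# (w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))
```

Because the decoder emits this string left to right, a mistake in an early span (say, a wrong root concept) conditions everything generated afterwards, which is the error-propagation issue the abstract refers to.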