A Two-stage Generative Chinese AMR Parsing Method Based on Large Language Models

Shen Zizhuo, Shao Yanqiu, Li Wei


Abstract
The purpose of the CAMR task is to convert natural language into a formalized semantic representation in the form of a graph structure. Due to the complexity of the AMR graph structure, traditional AMR parsing methods often require complex models and strategies. Thanks to the powerful generative capabilities of LLMs, an autoregressive generative approach to AMR parsing offers advantages such as simple modeling and strong extensibility. To further explore generative AMR parsing based on LLMs, we design a two-stage AMR parsing method based on LLMs for this CAMR evaluation. Specifically, we decompose parsing into two pipelined subtasks, alignment-aware node generation and relation-aware node generation, to reduce the difficulty of LLM understanding and generation. Additionally, to improve the system's transferability, we incorporate a retrieval-augmented strategy during both the training and inference phases. Experimental results show that the proposed method achieves promising results in this evaluation.
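The abstract outlines a two-stage, retrieval-augmented pipeline. The paper does not publish its prompts, retriever, or model interface, so the sketch below is purely illustrative: llm_generate, retrieve_demonstrations, and the prompt wording are hypothetical stand-ins, and string similarity via Python's difflib substitutes for whatever retriever the authors actually used.

```python
import difflib

def llm_generate(prompt: str) -> str:
    """Hypothetical stand-in for a call to the fine-tuned LLM."""
    raise NotImplementedError("plug in your own model or API here")

def retrieve_demonstrations(sentence, train_pairs, k=3):
    """Retrieval augmentation: pick the k training examples most similar
    to the input sentence. difflib similarity is only a placeholder for
    the authors' actual retriever."""
    scored = sorted(
        train_pairs,
        key=lambda p: difflib.SequenceMatcher(None, sentence, p[0]).ratio(),
        reverse=True,
    )
    return scored[:k]

def parse_camr(sentence, train_pairs):
    demos = retrieve_demonstrations(sentence, train_pairs)
    demo_block = "\n".join(f"Sentence: {s}\nCAMR: {g}" for s, g in demos)

    # Stage 1: alignment-aware node generation -- predict concept nodes
    # together with their alignments to words in the sentence.
    nodes = llm_generate(
        f"{demo_block}\n\nList the aligned concept nodes for:\n{sentence}"
    )

    # Stage 2: relation-aware generation -- condition on the predicted
    # nodes and autoregressively generate the full CAMR graph.
    graph = llm_generate(
        f"{demo_block}\n\nSentence: {sentence}\nNodes: {nodes}\nCAMR graph:"
    )
    return graph
```

The two calls mirror the pipelined decomposition: stage one commits to aligned concept nodes, and stage two conditions on them to emit the full graph, so each generation step is easier for the LLM than producing a complete CAMR graph in one shot.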
Anthology ID:
2024.ccl-3.17
Volume:
Proceedings of the 23rd Chinese National Conference on Computational Linguistics (Volume 3: Evaluations)
Month:
July
Year:
2024
Address:
Taiyuan, China
Editors:
Hongfei Lin, Hongye Tan, Bin Li
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Pages:
154–159
Language:
English
URL:
https://aclanthology.org/2024.ccl-3.17/
Cite (ACL):
Shen Zizhuo, Shao Yanqiu, and Li Wei. 2024. A Two-stage Generative Chinese AMR Parsing Method Based on Large Language Models. In Proceedings of the 23rd Chinese National Conference on Computational Linguistics (Volume 3: Evaluations), pages 154–159, Taiyuan, China. Chinese Information Processing Society of China.
Cite (Informal):
A Two-stage Generative Chinese AMR Parsing Method Based on Large Language Models (Shen et al., CCL 2024)
PDF:
https://aclanthology.org/2024.ccl-3.17.pdf