Hierarchical Curriculum Learning for AMR Parsing

Peiyi Wang, Liang Chen, Tianyu Liu, Damai Dai, Yunbo Cao, Baobao Chang, Zhifang Sui


Abstract
Abstract Meaning Representation (AMR) parsing aims to translate sentences into semantic representations with a hierarchical structure, and has recently been empowered by pretrained sequence-to-sequence models. However, there is a gap between their flat training objective (i.e., treating all output tokens equally) and the hierarchical AMR structure, which limits model generalization. To bridge this gap, we propose a Hierarchical Curriculum Learning (HCL) framework with a Structure-level Curriculum (SC) and an Instance-level Curriculum (IC). SC switches progressively from core to detailed AMR semantic elements, while IC transitions from structurally simple to structurally complex AMR instances during training. Through these two warming-up processes, HCL reduces the difficulty of learning complex structures, so the flat model can better adapt to the AMR hierarchy. Extensive experiments on AMR2.0, AMR3.0, structure-complex, and out-of-distribution settings verify the effectiveness of HCL.
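The instance-level curriculum described above, which moves from structurally simple to structurally complex AMR instances, can be sketched as follows. This is a minimal illustration, not the paper's released implementation: the complexity measure (maximum nesting depth of the linearized PENMAN string) and the linear pool-growth schedule are assumptions chosen for clarity, and all function and field names are hypothetical.

```python
# Hypothetical sketch of an instance-level curriculum (IC): rank training
# instances from structurally simple to complex AMR graphs, then grow the
# training pool over epochs so easy instances are seen first.

def amr_depth(amr_line: str) -> int:
    """Approximate structural complexity by the maximum parenthesis
    nesting depth of a linearized (PENMAN-style) AMR string."""
    depth = max_depth = 0
    for ch in amr_line:
        if ch == "(":
            depth += 1
            max_depth = max(max_depth, depth)
        elif ch == ")":
            depth -= 1
    return max_depth

def curriculum_pools(instances, num_epochs):
    """Yield one training pool per epoch: epoch e trains on the easiest
    (e + 1) / num_epochs fraction of the data, sorted by complexity."""
    ranked = sorted(instances, key=lambda x: amr_depth(x["amr"]))
    for epoch in range(num_epochs):
        cutoff = max(1, round(len(ranked) * (epoch + 1) / num_epochs))
        yield ranked[:cutoff]

# Example: the shallowest AMR enters training first.
data = [
    {"amr": "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))"},
    {"amr": "(d / dog)"},
    {"amr": "(r / run-02 :ARG0 (c / cat))"},
]
pools = list(curriculum_pools(data, num_epochs=3))
```

Under this schedule, the single-concept graph `(d / dog)` is the only instance in the first epoch's pool, and the full dataset is only reached in the final epoch; the paper's actual curricula additionally include a structure-level warm-up over semantic elements, which this sketch does not model.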
Anthology ID:
2022.acl-short.37
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
333–339
URL:
https://aclanthology.org/2022.acl-short.37
DOI:
10.18653/v1/2022.acl-short.37
Cite (ACL):
Peiyi Wang, Liang Chen, Tianyu Liu, Damai Dai, Yunbo Cao, Baobao Chang, and Zhifang Sui. 2022. Hierarchical Curriculum Learning for AMR Parsing. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 333–339, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Hierarchical Curriculum Learning for AMR Parsing (Wang et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-short.37.pdf
Software:
 2022.acl-short.37.software.zip
Code
 wangpeiyi9979/hcl-text2amr