BiBL: AMR Parsing and Generation with Bidirectional Bayesian Learning

Ziming Cheng, Zuchao Li, Hai Zhao


Abstract
Abstract Meaning Representation (AMR) offers a unified semantic representation for natural language sentences. Transforming between AMR and text thus yields two tasks in opposite directions, i.e., Text-to-AMR parsing and AMR-to-Text generation. Despite the duality of the two tasks, existing AMR studies focus on improving only one side, and their gains are largely attributable to large amounts of extra training data or complex structural modifications that slow inference. Instead, we propose data-efficient Bidirectional Bayesian learning (BiBL) to facilitate bidirectional information transfer by adopting a single-stage multitasking strategy, so that the resulting model also enjoys much lighter training. Evaluation on benchmark datasets shows that our proposed BiBL outperforms strong previous seq2seq refinements without the extra data that is indispensable in existing counterpart models. We release the code of BiBL at: https://github.com/KHAKhazeus/BiBL.
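The single-stage multitasking strategy mentioned above can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's actual objective: it assumes a shared seq2seq model producing a Text-to-AMR parsing loss and an AMR-to-Text generation loss per batch, combined with a hypothetical weighting `alpha` in one training stage rather than two separate pipelines:

```python
def joint_bidirectional_loss(parse_nll: float, gen_nll: float, alpha: float = 0.5) -> float:
    """Combine both directions of the AMR-text duality in a single stage.

    parse_nll -- negative log-likelihood of the AMR graph given the sentence
    gen_nll   -- negative log-likelihood of the sentence given the AMR graph
    alpha     -- hypothetical interpolation weight between the two directions
    """
    return alpha * parse_nll + (1.0 - alpha) * gen_nll


# Example: equal weighting of the two directions for one batch
loss = joint_bidirectional_loss(parse_nll=2.0, gen_nll=3.0, alpha=0.5)
```

Because both losses are optimized jointly, each direction can regularize the other, which is one way a bidirectional setup can be more data-efficient than training two independent models.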
Anthology ID:
2022.coling-1.485
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
5461–5475
URL:
https://aclanthology.org/2022.coling-1.485
Cite (ACL):
Ziming Cheng, Zuchao Li, and Hai Zhao. 2022. BiBL: AMR Parsing and Generation with Bidirectional Bayesian Learning. In Proceedings of the 29th International Conference on Computational Linguistics, pages 5461–5475, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
BiBL: AMR Parsing and Generation with Bidirectional Bayesian Learning (Cheng et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.485.pdf
Code
khakhazeus/bibl
Data
Bio
New3