Bi-Directional Multi-Granularity Generation Framework for Knowledge Graph-to-Text with Large Language Model

Haowei Du, Chen Li, Dinghao Zhang, Dongyan Zhao


Abstract
The knowledge graph-to-text (KG-to-text) generation task aims to synthesize coherent and engaging sentences that accurately convey the complex information derived from an input knowledge graph. Existing methods generate the whole target text from all KG triples at once and may therefore attribute incorrect KG triples to individual sentences. To address this, we propose a bi-directional multi-granularity generation framework. Instead of generating the whole text in one pass, we first generate each sentence from its corresponding triples and then compose the graph-level text from these sentences. Moreover, we design a backward relation extraction task to enhance the correctness of relational information. Our method achieves new state-of-the-art results on the WebNLG benchmark, and further analysis shows the effectiveness of the different modules.
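The abstract describes a two-step workflow: sentence-level generation from per-sentence triple groups, composition into the graph-level text, and a backward relation extraction check on the generated sentences. The snippet below is a minimal, hypothetical sketch of that idea only; the `llm(prompt)` callable, the prompts, and the relation-consistency check are placeholders for illustration, not the authors' implementation.

```python
# Hypothetical sketch of the abstract's workflow: generate one sentence per
# triple group, verify it by extracting triples back out ("backward relation
# extraction"), then compose the surviving sentences into the graph-level text.
from typing import Callable, List, Tuple

Triple = Tuple[str, str, str]  # (subject, relation, object)
LLM = Callable[[str], str]     # placeholder for any text-in/text-out LLM call


def sentence_from_triples(llm: LLM, triples: List[Triple]) -> str:
    """Sentence-level generation: verbalize one group of triples."""
    facts = "; ".join(f"{s} | {r} | {o}" for s, r, o in triples)
    return llm(f"Write one fluent sentence expressing these facts: {facts}")


def triples_from_sentence(llm: LLM, sentence: str) -> List[Triple]:
    """Backward relation extraction: recover (subject | relation | object) triples."""
    raw = llm(f"List the (subject | relation | object) triples stated in: {sentence}")
    triples = []
    for line in raw.splitlines():
        parts = [p.strip() for p in line.split("|")]
        if len(parts) == 3:
            triples.append((parts[0], parts[1], parts[2]))
    return triples


def generate_graph_text(llm: LLM, triple_groups: List[List[Triple]]) -> str:
    """Graph-level text: keep sentences whose extracted relations cover the input."""
    sentences = []
    for group in triple_groups:
        sent = sentence_from_triples(llm, group)
        recovered = {r for _, r, _ in triples_from_sentence(llm, sent)}
        expected = {r for _, r, _ in group}
        if expected <= recovered:  # relational consistency check passed
            sentences.append(sent)
    return " ".join(sentences)
```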
Anthology ID:
2024.acl-short.14
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
147–152
URL:
https://aclanthology.org/2024.acl-short.14
Cite (ACL):
Haowei Du, Chen Li, Dinghao Zhang, and Dongyan Zhao. 2024. Bi-Directional Multi-Granularity Generation Framework for Knowledge Graph-to-Text with Large Language Model. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 147–152, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Bi-Directional Multi-Granularity Generation Framework for Knowledge Graph-to-Text with Large Language Model (Du et al., ACL 2024)
PDF:
https://aclanthology.org/2024.acl-short.14.pdf