Presentations are not always linear! GNN meets LLM for Text Document-to-Presentation Transformation with Attribution

Himanshu Maheshwari, Sambaran Bandyopadhyay, Aparna Garimella, Anandhavelu Natarajan


Abstract
Automatically generating a presentation from the text of a long document is a challenging and useful problem. In contrast to a flat summary, a presentation needs a richer, non-linear narrative: the content of a single slide can come from different, non-contiguous parts of the given document. It is difficult, however, to realize such a non-linear mapping of content to slides while ensuring that the content remains faithful to the document; LLMs are prone to hallucination, and their performance degrades with the length of the input document. To address this, we propose a novel graph-based solution in which we learn a graph from the input document and use a combination of a graph neural network (GNN) and an LLM to generate a presentation with attribution of content for each slide. We conduct thorough experiments to show the merit of our approach compared to directly using LLMs for this task.
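The abstract's central idea — mapping non-contiguous parts of a document onto slides, with per-slide attribution back to the source — can be illustrated with a minimal sketch. This is a hypothetical toy, not the authors' implementation: it replaces the learned graph and GNN with word-overlap edges and one round of neighbor-averaging (a crude stand-in for message passing), and it skips the LLM generation step entirely. The function names (`build_graph`, `propagate`, `slide_plan`) are invented for illustration.

```python
# Hypothetical sketch of non-linear content-to-slide mapping with attribution.
# NOT the paper's method: overlap heuristics stand in for the learned graph,
# and one averaging round stands in for a trained GNN.

from itertools import combinations

def word_set(text):
    return set(text.lower().split())

def build_graph(sections, min_overlap=2):
    """Connect two sections if they share at least min_overlap words."""
    edges = {i: set() for i in range(len(sections))}
    for i, j in combinations(range(len(sections)), 2):
        if len(word_set(sections[i]) & word_set(sections[j])) >= min_overlap:
            edges[i].add(j)
            edges[j].add(i)
    return edges

def propagate(scores, edges, alpha=0.5):
    """One message-passing round: mix each node's score with its neighbors' mean."""
    out = []
    for i, s in enumerate(scores):
        nbrs = edges[i]
        nbr_mean = sum(scores[j] for j in nbrs) / len(nbrs) if nbrs else s
        out.append((1 - alpha) * s + alpha * nbr_mean)
    return out

def slide_plan(sections, query, n_slides=2, per_slide=2):
    """Fill slides with top-scoring sections; keep source indices as attribution."""
    q = word_set(query)
    scores = [len(q & word_set(s)) for s in sections]   # initial relevance
    scores = propagate(scores, build_graph(sections))   # graph-smoothed relevance
    ranked = sorted(range(len(sections)), key=lambda i: -scores[i])
    slides = []
    for k in range(n_slides):
        picked = ranked[k * per_slide:(k + 1) * per_slide]
        slides.append({"content": [sections[i] for i in picked],
                       "attribution": sorted(picked)})  # source-section indices
    return slides
```

Note how a single slide can draw on sections that are far apart in the source: the graph edge between related sections boosts both, so they are selected together regardless of their positions in the document. The `attribution` list is what makes each slide traceable back to the input.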
Anthology ID:
2024.findings-emnlp.936
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2024
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
15948–15962
URL:
https://aclanthology.org/2024.findings-emnlp.936
DOI:
10.18653/v1/2024.findings-emnlp.936
Cite (ACL):
Himanshu Maheshwari, Sambaran Bandyopadhyay, Aparna Garimella, and Anandhavelu Natarajan. 2024. Presentations are not always linear! GNN meets LLM for Text Document-to-Presentation Transformation with Attribution. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 15948–15962, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Presentations are not always linear! GNN meets LLM for Text Document-to-Presentation Transformation with Attribution (Maheshwari et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-emnlp.936.pdf