Guiding Large Language Models via External Attention Prompting for Scientific Extreme Summarization

Yuan Chang, Ziyue Li, Xiaoqiu Le


Abstract
Scientific extreme summarization, the task of generating a concise one-sentence summary (TLDR) for a scientific paper, is challenging because it requires deep domain-specific understanding and the ability to distill salient information. Through quantitative analysis, this study identifies the critical role of titles and keywords in improving TLDR generation. We propose External Attention Prompting (EAP), a novel method that guides LLMs to focus on the most critical parts of the source text through attention signals of varying strength. EAP uses Markdown emphasis syntax to annotate attention levels, enabling LLMs to prioritize salient information effectively. Extensive experiments show that EAP significantly outperforms baseline methods across various LLMs and metrics in both zero-shot and few-shot settings. Further evaluation by GPT-4 indicates that EAP enables LLMs to generate TLDRs whose quality aligns more closely with human judgments.
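The abstract describes annotating attention levels in the source text with Markdown emphasis syntax before prompting the LLM. The sketch below illustrates one plausible way such a prompt could be assembled; the specific marker scheme (** for high attention, * for medium), the helper names, and the prompt wording are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of Markdown-emphasis attention annotation for TLDR prompting.
# Marker scheme and prompt wording are assumptions for illustration only.

def annotate(text: str, high: list[str], medium: list[str]) -> str:
    """Wrap salient spans in Markdown emphasis to signal attention levels."""
    for span in high:        # e.g., the paper title and keywords
        text = text.replace(span, f"**{span}**")
    for span in medium:      # e.g., other salient sentences
        text = text.replace(span, f"*{span}*")
    return text

def build_prompt(paper_text: str, title: str, keywords: list[str]) -> str:
    """Build a TLDR prompt whose emphasized spans mark where to attend."""
    annotated = annotate(paper_text, high=[title] + keywords, medium=[])
    return (
        "Spans marked with ** are most important; spans marked with * are "
        "moderately important. Summarize the paper in one sentence (TLDR).\n\n"
        + annotated
    )
```

In this sketch the title and keywords receive the strongest emphasis, matching the abstract's finding that these fields are especially useful for TLDR generation.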
Anthology ID:
2024.sdp-1.22
Volume:
Proceedings of the Fourth Workshop on Scholarly Document Processing (SDP 2024)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Tirthankar Ghosal, Amanpreet Singh, Anita de Waard, Philipp Mayr, Aakanksha Naik, Orion Weller, Yoonjoo Lee, Shannon Shen, Yanxia Qin
Venues:
sdp | WS
Publisher:
Association for Computational Linguistics
Pages:
226–242
URL:
https://aclanthology.org/2024.sdp-1.22
Cite (ACL):
Yuan Chang, Ziyue Li, and Xiaoqiu Le. 2024. Guiding Large Language Models via External Attention Prompting for Scientific Extreme Summarization. In Proceedings of the Fourth Workshop on Scholarly Document Processing (SDP 2024), pages 226–242, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Guiding Large Language Models via External Attention Prompting for Scientific Extreme Summarization (Chang et al., sdp-WS 2024)
PDF:
https://aclanthology.org/2024.sdp-1.22.pdf