Explanation Regeneration via Information Bottleneck

Qintong Li, Zhiyong Wu, Lingpeng Kong, Wei Bi


Abstract
Explaining the black-box predictions of NLP models naturally and accurately is an important open problem in natural language generation. These free-text explanations are expected to contain sufficient and carefully-selected evidence to form supportive arguments for predictions. Thanks to the superior generative capacity of large pretrained language models (PLMs), recent work built on prompt engineering enables explanations to be generated without task-specific training. However, explanations generated through single-pass prompting often lack sufficiency and conciseness, due to prompt complexity and hallucination issues. To discard the dross and take the essence of current PLM outputs, we propose EIB, a method that produces sufficient and concise explanations via the information bottleneck (IB) theory. EIB regenerates an explanation by polishing the single-pass output of the PLM while retaining the information that supports the contents being explained, balancing two information bottleneck objectives. Experiments on two different tasks verify the effectiveness of EIB through automatic evaluation and thorough human evaluation.
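
For context, the "two information bottleneck objectives" mentioned in the abstract correspond to the standard IB trade-off (Tishby et al., 1999): compress a source while preserving information relevant to a target. Below is a minimal sketch of the classic IB Lagrangian, where X denotes the single-pass PLM output, T the regenerated explanation, and Y the content being explained; the variable roles here are illustrative, and the paper's exact formulation may differ:

\min_{p(t \mid x)} \; I(X; T) \; - \; \beta \, I(T; Y)

Minimizing I(X; T) pushes toward conciseness (discarding redundant content from the single-pass output), while maximizing I(T; Y) pushes toward sufficiency (retaining the evidence that supports the prediction); the coefficient \beta controls the balance between the two.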
Anthology ID:
2023.findings-acl.765
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
12081–12102
URL:
https://aclanthology.org/2023.findings-acl.765
DOI:
10.18653/v1/2023.findings-acl.765
Cite (ACL):
Qintong Li, Zhiyong Wu, Lingpeng Kong, and Wei Bi. 2023. Explanation Regeneration via Information Bottleneck. In Findings of the Association for Computational Linguistics: ACL 2023, pages 12081–12102, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Explanation Regeneration via Information Bottleneck (Li et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.765.pdf
Video:
https://aclanthology.org/2023.findings-acl.765.mp4