Dissecting Generation Modes for Abstractive Summarization Models via Ablation and Attribution

Jiacheng Xu, Greg Durrett


Abstract
Despite the prominence of neural abstractive summarization models, we know little about how they actually form summaries and how to understand where their decisions come from. We propose a two-step method to interpret summarization model decisions. We first analyze the model’s behavior by ablating the full model to categorize each decoder decision into one of several generation modes: roughly, is the model behaving like a language model, is it relying heavily on the input, or is it somewhere in between? After isolating decisions that do depend on the input, we explore interpreting these decisions using several different attribution methods. We compare these techniques based on their ability to select content and reconstruct the model’s predicted token from perturbations of the input, thus revealing whether highlighted attributions are truly important for the generation of the next token. While this machinery can be broadly useful even beyond summarization, we specifically demonstrate its capability to identify phrases the summarization model has memorized and determine where in the training pipeline this memorization happened, as well as study complex generation phenomena like sentence fusion on a per-instance basis.
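The first step of the method described in the abstract, deciding whether a decoder step behaves like a language model or depends on the input, can be illustrated with a minimal sketch. This is not the paper's exact procedure; the distributions, threshold values, and function name below are illustrative assumptions, and in practice the probabilities would come from running the summarization model with and without the source document.

```python
def categorize_decision(p_full, p_ablated, token, lm_thresh=0.5, shift_thresh=0.3):
    """Label one decoder step as LM-like, context-dependent, or mixed.

    p_full:    next-token probabilities from the full model (with source input)
    p_ablated: next-token probabilities with the source input ablated
    token:     the token the full model actually generated
    Thresholds are illustrative, not taken from the paper.
    """
    p_lm = p_ablated.get(token, 0.0)
    p = p_full.get(token, 0.0)
    if p_lm >= lm_thresh:
        return "language-model-like"   # predictable from the prefix alone
    if p - p_lm >= shift_thresh:
        return "context-dependent"     # the source input drives the choice
    return "mixed"

# Toy usage: a function word is often predictable without the input,
# while a named entity typically requires the source document.
full = {"the": 0.55, "Austin": 0.35}
ablated = {"the": 0.70, "Austin": 0.01}
print(categorize_decision(full, ablated, "the"))     # language-model-like
print(categorize_decision(full, ablated, "Austin"))  # context-dependent
```

Decisions labeled context-dependent are the ones the second step of the method would then pass to the attribution techniques for input-level explanation.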
Anthology ID:
2021.acl-long.539
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
6925–6940
URL:
https://aclanthology.org/2021.acl-long.539
DOI:
10.18653/v1/2021.acl-long.539
Cite (ACL):
Jiacheng Xu and Greg Durrett. 2021. Dissecting Generation Modes for Abstractive Summarization Models via Ablation and Attribution. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 6925–6940, Online. Association for Computational Linguistics.
Cite (Informal):
Dissecting Generation Modes for Abstractive Summarization Models via Ablation and Attribution (Xu & Durrett, ACL-IJCNLP 2021)
PDF:
https://aclanthology.org/2021.acl-long.539.pdf
Video:
https://aclanthology.org/2021.acl-long.539.mp4
Code:
jiacheng-xu/sum-interpret
Data:
C4 | WebText