Rationales for Sequential Predictions

Keyon Vafa, Yuntian Deng, David Blei, Alexander Rush


Abstract
Sequence models are a critical component of modern NLP systems, but their predictions are difficult to explain. We consider model explanations through rationales, subsets of context that can explain individual model predictions. We find sequential rationales by solving a combinatorial optimization: the best rationale is the smallest subset of input tokens that would predict the same output as the full sequence. Enumerating all subsets is intractable, so we propose an efficient greedy algorithm to approximate this objective. The algorithm, called greedy rationalization, applies to any model. For this approach to be effective, the model should form compatible conditional distributions when making predictions on incomplete subsets of the context. This condition can be enforced with a short fine-tuning step. We study greedy rationalization on language modeling and machine translation. Compared to existing baselines, greedy rationalization is best at optimizing the sequential objective and provides the most faithful rationales. On a new dataset of annotated sequential rationales, greedy rationales are most similar to human rationales.
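
The abstract describes the greedy procedure only at a high level. The following is a minimal Python sketch of one way such a greedy loop could look; it is not the authors' implementation (see the linked repository for that), and the predict callable is a hypothetical stand-in for a compatibility fine-tuned sequence model that can score predictions on incomplete context subsets.

# Minimal sketch of greedy rationalization as described in the abstract.
# `predict(context, subset, target)` is a hypothetical interface: it runs the
# model on only the context tokens whose indices are in `subset` and returns
# (argmax_token, log_prob_of_target).

def greedy_rationalize(context, target, predict, max_steps=None):
    """Greedily grow the smallest context subset that predicts `target`."""
    remaining = set(range(len(context)))
    rationale = []                      # indices added so far
    max_steps = max_steps or len(context)

    for _ in range(max_steps):
        if not remaining:
            break
        # Add the single remaining token whose inclusion most raises the
        # model's probability of the target prediction.
        best_idx, best_logp = None, float("-inf")
        for i in remaining:
            _, logp = predict(context, rationale + [i], target)
            if logp > best_logp:
                best_idx, best_logp = i, logp
        rationale.append(best_idx)
        remaining.remove(best_idx)

        # Stop once the rationale alone yields the same prediction the model
        # makes with the full context.
        top_token, _ = predict(context, rationale, target)
        if top_token == target:
            break

    return sorted(rationale)

Each step requires rescoring every remaining candidate token, so the sketch trades off exactness of the combinatorial objective for a number of model calls that is quadratic in the context length at worst.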
Anthology ID:
2021.emnlp-main.807
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
10314–10332
URL:
https://aclanthology.org/2021.emnlp-main.807
DOI:
10.18653/v1/2021.emnlp-main.807
Cite (ACL):
Keyon Vafa, Yuntian Deng, David Blei, and Alexander Rush. 2021. Rationales for Sequential Predictions. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 10314–10332, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Rationales for Sequential Predictions (Vafa et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.807.pdf
Video:
https://aclanthology.org/2021.emnlp-main.807.mp4
Code
keyonvafa/sequential-rationales (+ additional community code)
Data
LAMBADA