Unsupervised Selective Rationalization with Noise Injection

Adam Storek, Melanie Subbiah, Kathleen McKeown


Abstract
A major issue with using deep learning models in sensitive applications is that they provide no explanation for their output. To address this problem, unsupervised selective rationalization produces rationales alongside predictions by chaining two jointly-trained components, a rationale generator and a predictor. Although this architecture guarantees that the prediction relies solely on the rationale, it does not ensure that the rationale contains a plausible explanation for the prediction. We introduce a novel training technique that effectively limits generation of implausible rationales by injecting noise between the generator and the predictor. Furthermore, we propose a new benchmark for evaluating unsupervised selective rationalization models using movie reviews from existing datasets. We achieve sizeable improvements in rationale plausibility and task accuracy over the state-of-the-art across a variety of tasks, including our new benchmark, while maintaining or improving model faithfulness.
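The abstract describes chaining a rationale generator and a predictor, with noise injected between the two during training. Below is a minimal PyTorch sketch of that generator → noise → predictor chain. It is an illustrative assumption, not the authors' implementation: the class names, the token-dropout style of noise, and the mean pooling are all hypothetical, and the hard mask would need a gradient estimator (e.g., straight-through or REINFORCE) to train end-to-end, which is omitted here. See the linked PDF for the actual method.

```python
import torch
import torch.nn as nn


class RationaleModel(nn.Module):
    """Illustrative generator -> noise -> predictor chain (not the authors' code)."""

    def __init__(self, encoder_dim: int, num_classes: int, noise_p: float = 0.1):
        super().__init__()
        # Generator scores each token; a binary mask over tokens is the rationale.
        self.generator = nn.Linear(encoder_dim, 1)
        # Predictor sees only the (noised) rationale, so the prediction
        # depends on the selected tokens alone (faithfulness by construction).
        self.predictor = nn.Sequential(
            nn.Linear(encoder_dim, encoder_dim),
            nn.ReLU(),
            nn.Linear(encoder_dim, num_classes),
        )
        self.noise_p = noise_p  # probability of dropping a selected token

    def forward(self, token_embeddings: torch.Tensor) -> torch.Tensor:
        # token_embeddings: (batch, seq_len, encoder_dim)
        scores = self.generator(token_embeddings).squeeze(-1)  # (batch, seq_len)
        mask = (torch.sigmoid(scores) > 0.5).float()           # hard rationale mask

        if self.training:
            # Noise injection between generator and predictor: randomly drop
            # rationale tokens so the predictor cannot lean on a few
            # degenerate, implausible selections.
            keep = (torch.rand_like(mask) > self.noise_p).float()
            mask = mask * keep

        # Zero out non-rationale tokens and mean-pool the survivors.
        rationale = token_embeddings * mask.unsqueeze(-1)
        pooled = rationale.sum(dim=1) / mask.sum(dim=1, keepdim=True).clamp(min=1.0)
        return self.predictor(pooled)
```

As a usage sketch, `RationaleModel(encoder_dim=768, num_classes=2)` could sit on top of any token encoder; at inference time (`model.eval()`) the noise is disabled and the unperturbed rationale mask is both the explanation and the predictor's sole input.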
Anthology ID:
2023.acl-long.707
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
12647–12659
URL:
https://aclanthology.org/2023.acl-long.707
DOI:
10.18653/v1/2023.acl-long.707
Cite (ACL):
Adam Storek, Melanie Subbiah, and Kathleen McKeown. 2023. Unsupervised Selective Rationalization with Noise Injection. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 12647–12659, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Unsupervised Selective Rationalization with Noise Injection (Storek et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.707.pdf
Video:
https://aclanthology.org/2023.acl-long.707.mp4