Implicit Premise Generation with Discourse-aware Commonsense Knowledge Models

Tuhin Chakrabarty, Aadit Trivedi, Smaranda Muresan


Abstract
Enthymemes are arguments in which a premise or conclusion is left implicit. We tackle the task of generating the implicit premise in an enthymeme, which requires understanding not only the stated premise and conclusion but also additional inferences that may depend on commonsense knowledge. The largest available dataset for enthymemes (Habernal et al., 2018) consists of 1.7k samples, which is not large enough to train a neural text generation model. To address this issue, we take advantage of a similar task and dataset: abductive reasoning in narrative text (Bhagavatula et al., 2020). However, we show that simply fine-tuning a state-of-the-art seq2seq model on this data does not reliably generate meaningful implicit premises for the given enthymemes. We demonstrate that encoding discourse-aware commonsense knowledge during fine-tuning improves the quality of the generated implicit premises and outperforms all other baselines in both automatic and human evaluations on three different datasets.
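The abstract outlines the approach at a high level: fine-tune a pretrained seq2seq model on abductive-reasoning data, with the input augmented by discourse-aware commonsense inferences. The sketch below illustrates one way such knowledge-augmented fine-tuning could be set up in Python. The BART checkpoint, the separator convention, and the example strings are illustrative assumptions rather than the authors' implementation; the linked code repository (tuhinjubcse/enthymemesemnlp2021) has the actual setup.

import torch
from transformers import BartForConditionalGeneration, BartTokenizer

# Model choice is an assumption; the paper fine-tunes a pretrained seq2seq model.
tokenizer = BartTokenizer.from_pretrained("facebook/bart-large")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large")

# Hypothetical training instance: an enthymeme (stated premise + conclusion)
# augmented with a commonsense inference, paired with its implicit premise.
premise = "I turned 18 last week."
conclusion = "So I can finally vote."
commonsense = "Turning 18 makes a person a legal adult."  # e.g., from a COMET-style knowledge model

# Separator convention is illustrative; the paper's input format may differ.
source = f"{premise} </s> {conclusion} </s> {commonsense}"
target = "People who are 18 or older are eligible to vote."

inputs = tokenizer(source, return_tensors="pt", truncation=True)
labels = tokenizer(target, return_tensors="pt", truncation=True).input_ids

# One standard seq2seq fine-tuning step: the model learns to generate the
# implicit premise conditioned on the knowledge-augmented enthymeme.
outputs = model(**inputs, labels=labels)
outputs.loss.backward()  # an optimizer step would follow in a real training loop

# At inference time, generate the implicit premise for a new enthymeme.
with torch.no_grad():
    generated = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(generated[0], skip_special_tokens=True))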
Anthology ID:
2021.emnlp-main.504
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
6247–6252
URL:
https://aclanthology.org/2021.emnlp-main.504
DOI:
10.18653/v1/2021.emnlp-main.504
Cite (ACL):
Tuhin Chakrabarty, Aadit Trivedi, and Smaranda Muresan. 2021. Implicit Premise Generation with Discourse-aware Commonsense Knowledge Models. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 6247–6252, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Implicit Premise Generation with Discourse-aware Commonsense Knowledge Models (Chakrabarty et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.504.pdf
Video:
https://aclanthology.org/2021.emnlp-main.504.mp4
Code:
tuhinjubcse/enthymemesemnlp2021
Data:
ART Dataset