NLI to the Rescue: Mapping Entailment Classes to Hallucination Categories in Abstractive Summarization

Naveen Badathala, Ashita Saxena, Pushpak Bhattacharyya


Abstract
In this paper, we detect hallucinations in summaries generated by abstractive summarization models. We consider three categories: intrinsic hallucination, extrinsic hallucination, and non-hallucinated. Our detection method is based on textual entailment: given a premise and a hypothesis, textual entailment classifies the hypothesis as contradiction, neutral, or entailment. These three entailment classes are mapped to intrinsic, extrinsic, and non-hallucinated, respectively. We fine-tune a RoBERTa-large model on NLI datasets and use it to detect hallucinations on the XSumFaith dataset. We demonstrate that our simple entailment-based approach outperforms existing factual inconsistency detection systems by 12%, and we provide an insightful analysis of all types of hallucination. To advance research in this area, we create and release a dataset, XSumFaith++, which contains balanced instances of hallucinated and non-hallucinated summaries.
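The core of the approach described in the abstract is a fixed mapping from the three NLI verdicts (with the source document as premise and the summary as hypothesis) to the three hallucination categories. A minimal sketch of that mapping step is below; the label names and the score dictionary are assumptions about the output of the paper's fine-tuned RoBERTa-large NLI model, not its actual interface.

```python
from typing import Dict

# Mapping stated in the abstract: contradiction -> intrinsic,
# neutral -> extrinsic, entailment -> non-hallucinated.
NLI_TO_HALLUCINATION = {
    "contradiction": "intrinsic",
    "neutral": "extrinsic",
    "entailment": "non-hallucinated",
}


def classify_summary(nli_scores: Dict[str, float]) -> str:
    """Map an NLI model's class probabilities to a hallucination category.

    `nli_scores` is a hypothetical {label: probability} dict produced by
    an NLI model run on (document, summary) as (premise, hypothesis).
    """
    # Take the argmax NLI class, then apply the fixed mapping.
    top_label = max(nli_scores, key=nli_scores.get)
    return NLI_TO_HALLUCINATION[top_label]


# Example with made-up probabilities for one (document, summary) pair:
print(classify_summary({"contradiction": 0.7, "neutral": 0.2, "entailment": 0.1}))
```

In this sketch the NLI model itself is a black box; only the class-to-category mapping, which is the paper's central idea, is made explicit.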
Anthology ID:
2023.icon-1.12
Volume:
Proceedings of the 20th International Conference on Natural Language Processing (ICON)
Month:
December
Year:
2023
Address:
Goa University, Goa, India
Editors:
Jyoti D. Pawar, Sobha Lalitha Devi
Venue:
ICON
SIG:
SIGLEX
Publisher:
NLP Association of India (NLPAI)
Pages:
120–132
URL:
https://aclanthology.org/2023.icon-1.12
Cite (ACL):
Naveen Badathala, Ashita Saxena, and Pushpak Bhattacharyya. 2023. NLI to the Rescue: Mapping Entailment Classes to Hallucination Categories in Abstractive Summarization. In Proceedings of the 20th International Conference on Natural Language Processing (ICON), pages 120–132, Goa University, Goa, India. NLP Association of India (NLPAI).
Cite (Informal):
NLI to the Rescue: Mapping Entailment Classes to Hallucination Categories in Abstractive Summarization (Badathala et al., ICON 2023)
PDF:
https://aclanthology.org/2023.icon-1.12.pdf