Fast and Accurate Factual Inconsistency Detection Over Long Documents

Barrett Lattimer, Patrick Chen, Xinyuan Zhang, Yi Yang


Abstract
Generative AI models exhibit remarkable potential; however, hallucinations across various tasks present a significant challenge, particularly for longer inputs that current approaches struggle to address effectively. We introduce SCALE (Source Chunking Approach for Large-scale inconsistency Evaluation), a task-agnostic model for detecting factual inconsistencies using a novel chunking strategy. Specifically, SCALE is a Natural Language Inference (NLI) based model that uses large text chunks to condition over long texts. This approach achieves state-of-the-art performance in factual inconsistency detection for diverse tasks and long inputs. Additionally, we leverage the chunking mechanism and employ a novel algorithm to explain SCALE’s decisions through relevant source sentence retrieval. Our evaluations reveal that SCALE outperforms existing methods both on standard benchmarks and on ScreenEval, a new long-form dialogue dataset we constructed. Moreover, SCALE surpasses competitive systems in efficiency and model explanation evaluations. We have released our code and data publicly on GitHub.
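To make the chunk-based NLI idea concrete, below is a minimal sketch of how a long source document can be split into large chunks and scored against a claim with an off-the-shelf NLI model. The model name, character-level chunk size, and max-over-chunks aggregation are illustrative assumptions for demonstration, not the paper's released implementation (see the linked GitHub code for that).

```python
# Illustrative sketch of chunk-based NLI scoring for factual consistency.
# NOTE: this is NOT the authors' released SCALE implementation; the model,
# chunk size, and max-aggregation below are assumptions for demonstration.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_NAME = "microsoft/deberta-large-mnli"  # assumed off-the-shelf NLI model
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME)
model.eval()


def chunk_source(source: str, chunk_size: int = 2000) -> list[str]:
    """Split a long source document into large fixed-size character chunks."""
    return [source[i:i + chunk_size] for i in range(0, len(source), chunk_size)]


def consistency_score(source: str, claim: str) -> float:
    """Score a claim against each source chunk with NLI and keep the best chunk.

    A claim is treated as consistent if at least one chunk entails it, so we
    aggregate per-chunk entailment probabilities with max. Tracking which
    chunk wins also gives a coarse form of source retrieval for explanation.
    """
    best = 0.0
    for chunk in chunk_source(source):
        inputs = tokenizer(chunk, claim, truncation=True, return_tensors="pt")
        with torch.no_grad():
            logits = model(**inputs).logits
        probs = torch.softmax(logits, dim=-1)[0]
        # For this MNLI checkpoint: 0=contradiction, 1=neutral, 2=entailment.
        best = max(best, probs[2].item())
    return best


if __name__ == "__main__":
    document = "The meeting was moved to Friday. Alice will present the budget."
    print(consistency_score(document, "Alice presents the budget."))
    print(consistency_score(document, "Bob presents the budget."))
```

A downstream detector would compare this score against a threshold tuned on validation data, flagging claims that no chunk entails as factually inconsistent.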
Anthology ID:
2023.emnlp-main.105
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1691–1703
URL:
https://aclanthology.org/2023.emnlp-main.105
DOI:
10.18653/v1/2023.emnlp-main.105
Cite (ACL):
Barrett Lattimer, Patrick Chen, Xinyuan Zhang, and Yi Yang. 2023. Fast and Accurate Factual Inconsistency Detection Over Long Documents. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 1691–1703, Singapore. Association for Computational Linguistics.
Cite (Informal):
Fast and Accurate Factual Inconsistency Detection Over Long Documents (Lattimer et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.105.pdf
Video:
https://aclanthology.org/2023.emnlp-main.105.mp4