Why LLMs Hallucinate, and How to Get (Evidential) Closure: Perceptual, Intensional, and Extensional Learning for Faithful Natural Language Generation

Adam Bouyamourn


Abstract
We show that LLMs hallucinate because their output is not constrained to be synonymous with claims for which they have evidence: a condition that we call evidential closure. Information about the truth or falsity of sentences is not statistically identified in the standard neural language generation setup, and so cannot be conditioned on to generate new strings. We then show how to constrain LLMs to produce output that satisfies evidential closure. A multimodal LLM must learn about the external world (perceptual learning); it must learn a mapping from strings to states of the world (extensional learning); and, to achieve fluency when generalizing beyond a body of evidence, it must learn mappings from strings to their synonyms (intensional learning). The output of a unimodal LLM must be synonymous with strings in a validated evidence set. Finally, we present a heuristic procedure, Learn-Babble-Prune, that yields faithful output from an LLM by rejecting output that is not synonymous with claims for which the LLM has evidence.
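The abstract's Learn-Babble-Prune procedure can be sketched as a simple rejection loop: generate candidate sentences ("babble"), then keep only those synonymous with a claim in a validated evidence set ("prune"). The sketch below is a toy illustration, not the paper's implementation; the evidence set, the candidates, and the exact-match synonymy check are all stand-in assumptions (a real system would use a learned paraphrase or entailment model for the synonymy test).

```python
# Toy sketch of a Learn-Babble-Prune-style rejection loop.
# All data and the synonymy check are illustrative stand-ins.

# Validated evidence set: claims the model has evidence for (assumed).
EVIDENCE = {
    "the cat is on the mat",
    "it rained in singapore yesterday",
}


def is_synonymous(candidate: str, claim: str) -> bool:
    """Toy synonymy check: exact match after normalization.

    A real system would substitute a learned paraphrase or
    entailment model here.
    """
    return candidate.strip().lower() == claim.strip().lower()


def prune(candidates):
    """Keep only candidates synonymous with some evidenced claim."""
    return [
        c for c in candidates
        if any(is_synonymous(c, e) for e in EVIDENCE)
    ]


# "Babble": candidate outputs sampled from a generator (stubbed here).
babble = ["The cat is on the mat", "The moon is made of cheese"]
faithful = prune(babble)  # only the evidenced claim survives
```

Under this framing, faithfulness is enforced at the output side: any generation that cannot be matched to the evidence set is rejected rather than emitted.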
Anthology ID:
2023.emnlp-main.192
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3181–3193
URL:
https://aclanthology.org/2023.emnlp-main.192
DOI:
10.18653/v1/2023.emnlp-main.192
Cite (ACL):
Adam Bouyamourn. 2023. Why LLMs Hallucinate, and How to Get (Evidential) Closure: Perceptual, Intensional, and Extensional Learning for Faithful Natural Language Generation. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 3181–3193, Singapore. Association for Computational Linguistics.
Cite (Informal):
Why LLMs Hallucinate, and How to Get (Evidential) Closure: Perceptual, Intensional, and Extensional Learning for Faithful Natural Language Generation (Bouyamourn, EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.192.pdf
Video:
https://aclanthology.org/2023.emnlp-main.192.mp4