The Effect of Scaling, Retrieval Augmentation and Form on the Factual Consistency of Language Models

Lovisa Hagström, Denitsa Saynova, Tobias Norlund, Moa Johansson, Richard Johansson


Abstract
Large Language Models (LLMs) make natural interfaces to factual knowledge, but their usefulness is limited by their tendency to deliver inconsistent answers to semantically equivalent questions. For example, a model might supply the answer “Edinburgh” to “Anne Redpath passed away in X.” and “London” to “Anne Redpath’s life ended in X.” In this work, we identify potential causes of inconsistency and evaluate the effectiveness of two mitigation strategies: up-scaling and augmenting the LM with a passage retrieval database. Our results on the LLaMA and Atlas models show that both strategies reduce inconsistency but that retrieval augmentation is considerably more efficient. We further consider and disentangle the consistency contributions of different components of Atlas. For all LMs evaluated we find that syntactical form and task artifacts impact consistency. Taken together, our results provide a better understanding of the factors affecting the factual consistency of language models.
Anthology ID:
2023.emnlp-main.332
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
5457–5476
URL:
https://aclanthology.org/2023.emnlp-main.332
DOI:
10.18653/v1/2023.emnlp-main.332
Cite (ACL):
Lovisa Hagström, Denitsa Saynova, Tobias Norlund, Moa Johansson, and Richard Johansson. 2023. The Effect of Scaling, Retrieval Augmentation and Form on the Factual Consistency of Language Models. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 5457–5476, Singapore. Association for Computational Linguistics.
Cite (Informal):
The Effect of Scaling, Retrieval Augmentation and Form on the Factual Consistency of Language Models (Hagström et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.332.pdf
Video:
https://aclanthology.org/2023.emnlp-main.332.mp4