Learning to Generate Answers with Citations via Factual Consistency Models

Rami Aly, Zhiqiang Tang, Samson Tan, George Karypis


Abstract
Large Language Models (LLMs) frequently hallucinate, impeding their reliability in mission-critical situations. One approach to address this issue is to provide citations to relevant sources alongside generated content, enhancing the verifiability of generations. However, citing passages accurately in answers remains a substantial challenge. This paper proposes a weakly-supervised fine-tuning method leveraging factual consistency models (FCMs). Our approach alternates between generating texts with citations and supervised fine-tuning with FCM-filtered citation data. Focused learning is integrated into the objective, directing the fine-tuning process to emphasise the factual unit tokens, as measured by an FCM. Results on the ALCE few-shot citation benchmark with various instruction-tuned LLMs demonstrate superior performance compared to in-context learning, vanilla supervised fine-tuning, and state-of-the-art methods, with an average improvement of 34.1, 15.5, and 10.5 citation F1 points, respectively. Moreover, in a domain-transfer setting, we show that the obtained citation-generation ability transfers robustly to unseen datasets. Notably, our citation improvements contribute to the lowest factual error rate among all baselines.
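To make the two ideas in the abstract concrete, here is a minimal PyTorch sketch (not the authors' code) of (1) keeping only generated answers whose citations a factual consistency model judges consistent, and (2) a "focused" cross-entropy loss that up-weights factual-unit tokens. The names `fcm_score`, `factual_mask`, `CONSISTENCY_THRESHOLD`, and `ALPHA` are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch of FCM-filtered data selection and a token-weighted
# ("focused") fine-tuning loss. Assumed helpers: fcm_score(answer, passage)
# returns a consistency score in [0, 1]; factual_mask marks factual-unit
# tokens identified by the FCM.

import torch
import torch.nn.functional as F

CONSISTENCY_THRESHOLD = 0.5  # assumed cutoff for keeping a citation example
ALPHA = 2.0                  # assumed extra weight on factual-unit tokens


def filter_citation_data(samples, fcm_score):
    """Keep generated (answer, cited passage) pairs the FCM deems consistent.

    `samples` is a list of dicts with 'answer' and 'passage' strings.
    """
    return [s for s in samples
            if fcm_score(s["answer"], s["passage"]) >= CONSISTENCY_THRESHOLD]


def focused_nll_loss(logits, labels, factual_mask):
    """Token-level cross-entropy with larger weight on factual-unit tokens.

    logits:       (batch, seq_len, vocab) model outputs
    labels:       (batch, seq_len) target token ids (-100 = ignore)
    factual_mask: (batch, seq_len) float, 1.0 where the FCM flags a factual unit
    """
    vocab = logits.size(-1)
    per_token = F.cross_entropy(
        logits.view(-1, vocab), labels.view(-1),
        ignore_index=-100, reduction="none",
    ).view(labels.shape)

    weights = 1.0 + ALPHA * factual_mask          # boost factual-unit tokens
    weights = weights * (labels != -100).float()  # drop ignored positions
    return (weights * per_token).sum() / weights.sum().clamp(min=1.0)
```

The soft weighting (1 + ALPHA * mask) rather than a hard mask is one plausible reading of "focused learning": the model still receives signal on connective, non-factual tokens while factual units dominate the gradient.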
Anthology ID:
2024.acl-long.641
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
11876–11896
URL:
https://aclanthology.org/2024.acl-long.641
Cite (ACL):
Rami Aly, Zhiqiang Tang, Samson Tan, and George Karypis. 2024. Learning to Generate Answers with Citations via Factual Consistency Models. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 11876–11896, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Learning to Generate Answers with Citations via Factual Consistency Models (Aly et al., ACL 2024)
PDF:
https://aclanthology.org/2024.acl-long.641.pdf