CoTAR: Chain-of-Thought Attribution Reasoning with Multi-level Granularity

Moshe Berchansky, Daniel Fleischer, Moshe Wasserblat, Peter Izsak


Abstract
State-of-the-art performance in QA tasks is currently achieved by systems employing Large Language Models (LLMs); however, these models tend to hallucinate information in their responses. One approach focuses on enhancing the generation process by incorporating attribution from the given input into the output. However, identifying appropriate attributions and verifying their accuracy against a source is a complex task that requires significant improvements in assessing such systems. We introduce an attribution-oriented Chain-of-Thought reasoning method to enhance the accuracy of attributions. This approach focuses the reasoning process on generating an attribution-centric output. Evaluations on two context-enhanced question-answering datasets using GPT-4 demonstrate improved accuracy and correctness of attributions. In addition, combining our method with fine-tuning enhances the response and attribution accuracy of two smaller LLMs, showing their potential to outperform GPT-4 in some cases.
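As a rough illustration of the idea (this page includes no code; the prompt wording, the granularity names, and the build_cotar_prompt helper below are hypothetical, not the authors' released prompts), an attribution-oriented chain-of-thought prompt at several granularity levels might be assembled like this:

```python
# Illustrative sketch only: the exact CoTAR prompts and granularity levels are
# defined in the paper. All names here are hypothetical.

# Hypothetical reasoning instructions for three attribution granularities.
GRANULARITY_INSTRUCTIONS = {
    "span": "First quote the exact spans from the context that support the answer, then give the answer.",
    "sentence": "First cite the numbered context sentences that support the answer, then give the answer.",
    "passage": "First identify which passage(s) support the answer, then give the answer.",
}

def build_cotar_prompt(question: str, passages: list[str], granularity: str = "sentence") -> str:
    """Assemble an attribution-first chain-of-thought prompt for a chat LLM."""
    # Number the passages so the model can refer back to them in its attribution.
    context = "\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
    return (
        f"Context:\n{context}\n\n"
        f"Question: {question}\n\n"
        f"Reasoning instruction: {GRANULARITY_INSTRUCTIONS[granularity]}\n"
        "Produce the attribution first, then the final answer."
    )

if __name__ == "__main__":
    prompt = build_cotar_prompt(
        "Where was EMNLP 2024 held?",
        ["Findings of EMNLP 2024 was published in November 2024.",
         "The conference took place in Miami, Florida, USA."],
        granularity="sentence",
    )
    print(prompt)  # send this to any chat LLM (e.g., GPT-4) to elicit attribution-first output
```

The point of the attribution-first ordering is that the model commits to supporting evidence before producing the answer, so the answer is conditioned on the cited context rather than justified after the fact.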
Anthology ID: 2024.findings-emnlp.13
Volume: Findings of the Association for Computational Linguistics: EMNLP 2024
Month: November
Year: 2024
Address: Miami, Florida, USA
Editors: Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 236–246
URL: https://aclanthology.org/2024.findings-emnlp.13
Cite (ACL): Moshe Berchansky, Daniel Fleischer, Moshe Wasserblat, and Peter Izsak. 2024. CoTAR: Chain-of-Thought Attribution Reasoning with Multi-level Granularity. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 236–246, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal): CoTAR: Chain-of-Thought Attribution Reasoning with Multi-level Granularity (Berchansky et al., Findings 2024)
PDF: https://aclanthology.org/2024.findings-emnlp.13.pdf