Think While You Write: Hypothesis Verification Promotes Faithful Knowledge-to-Text Generation

Yifu Qiu, Varun Embar, Shay Cohen, Benjamin Han


Abstract
Knowledge-to-text generators often struggle to faithfully generate descriptions for the input facts: they may produce hallucinations that contradict the input, or describe facts not present in the input. To reduce hallucinations, we propose a decoding-only method, TWEAK (Think While Effectively Articulating Knowledge), which can be integrated with any generator without retraining. TWEAK treats the sequences generated at each decoding step, along with their future continuations, as hypotheses, and ranks each generation candidate based on how well its hypotheses are supported by the input facts, as judged by a Hypothesis Verification Model (HVM). We first demonstrate the effectiveness of TWEAK by using a Natural Language Inference (NLI) model as the HVM and report improved faithfulness with minimal impact on quality. We then replace the NLI model with a task-specific HVM trained on a first-of-its-kind dataset, FATE (Fact-Aligned Textual Entailment), which pairs input facts with their original and perturbed descriptions. We test TWEAK with two generators; the best TWEAK variants improve faithfulness (FactKB) by an average of 2.24/7.17 points across the two models on in-/out-of-distribution evaluations, respectively, with only a 0.14/0.32-point decline in quality (BERTScore).
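The decoding-time reranking idea in the abstract can be sketched compactly. Below is a minimal, hypothetical illustration of the NLI-as-HVM variant: an off-the-shelf NLI model (roberta-large-mnli) scores how strongly the linearized input facts entail each candidate hypothesis, and that support score is mixed with the generator's log-probability to select a candidate. The fact linearization, candidate texts, log-probabilities, alpha weighting, and function names are all illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of TWEAK-style decoding-time reranking with an NLI model
# standing in for the Hypothesis Verification Model (HVM). Model choice,
# fact linearization, candidates, and alpha are illustrative assumptions.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL = "roberta-large-mnli"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
nli = AutoModelForSequenceClassification.from_pretrained(MODEL)
nli.eval()

def entailment_score(premise: str, hypothesis: str) -> float:
    """P(entailment) of a candidate hypothesis given the linearized facts."""
    inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = nli(**inputs).logits
    # roberta-large-mnli label order: 0=contradiction, 1=neutral, 2=entailment
    return torch.softmax(logits, dim=-1)[0, 2].item()

def rerank(facts: str, candidates: list[tuple[str, float]], alpha: float = 0.5):
    """Score each (text, generator log-prob) candidate by mixing the
    generator's score with HVM support; return the best candidate."""
    scored = [
        (text, alpha * logprob + (1.0 - alpha) * entailment_score(facts, text))
        for text, logprob in candidates
    ]
    return max(scored, key=lambda pair: pair[1])

# Hypothetical beam candidates at one decoding step: (text, log-probability).
facts = "Alan Bean | occupation | test pilot . Alan Bean | nationality | United States"
candidates = [
    ("Alan Bean was an American test pilot.", -1.2),  # supported by the facts
    ("Alan Bean was a Russian astronaut.", -1.0),     # hallucinated content
]
print(rerank(facts, candidates))  # the faithful candidate should win
```

In a sketch like this, the weight alpha governs the trade-off the abstract reports: leaning on the HVM score promotes faithfulness, while leaning on the generator's log-probability preserves generation quality.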
Anthology ID:
2024.findings-naacl.106
Volume:
Findings of the Association for Computational Linguistics: NAACL 2024
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1628–1644
URL:
https://aclanthology.org/2024.findings-naacl.106
Cite (ACL):
Yifu Qiu, Varun Embar, Shay Cohen, and Benjamin Han. 2024. Think While You Write: Hypothesis Verification Promotes Faithful Knowledge-to-Text Generation. In Findings of the Association for Computational Linguistics: NAACL 2024, pages 1628–1644, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Think While You Write: Hypothesis Verification Promotes Faithful Knowledge-to-Text Generation (Qiu et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-naacl.106.pdf
Copyright:
2024.findings-naacl.106.copyright.pdf