Probing Factually Grounded Content Transfer with Factual Ablation

Peter West, Chris Quirk, Michel Galley, Yejin Choi


Abstract
Despite recent success, large neural models often generate factually incorrect text. Compounding this is the lack of a standard automatic evaluation for factuality: it cannot be meaningfully improved if it cannot be measured. Grounded generation promises a path to solving both of these problems: models draw on a reliable external document (grounding) for factual information, simplifying the challenge of factuality. Measuring factuality is also simplified, to factual consistency: testing whether the generation agrees with the grounding, rather than with all facts. Yet, without a standard automatic metric for factual consistency, factually grounded generation remains an open problem. We study this problem for content transfer, in which generations extend a prompt, using information from factual grounding. In particular, this domain allows us to introduce the notion of factual ablation for automatically measuring factual consistency: it captures the intuition that a model should be less likely to produce an output given a less relevant grounding document. In practice, we measure this by presenting a model with two grounding documents; the model should prefer to use the more factually relevant one. We contribute two evaluation sets to measure this. Applying our new evaluation, we propose several novel methods that improve over strong baselines.
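The factual ablation test described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names are hypothetical, and the toy word-overlap scorer stands in for what would, in practice, be a language model's log-probability of the continuation given the prompt and grounding.

```python
# Sketch of "factual ablation": a model should assign a higher score to a
# continuation when paired with the more factually relevant grounding
# document than with an ablated (less relevant) one. All names here are
# illustrative; overlap_score is a toy stand-in for an LM likelihood.

def overlap_score(grounding: str, prompt: str, target: str) -> float:
    """Toy stand-in for log P(target | prompt, grounding): word overlap
    between the grounding document and the target continuation."""
    grounding_words = set(grounding.lower().split())
    return sum(1.0 for w in target.lower().split() if w in grounding_words)

def factual_ablation_accuracy(examples, score_fn=overlap_score) -> float:
    """Fraction of examples where the relevant grounding outscores the
    ablated grounding, i.e. the model 'prefers' the relevant document."""
    wins = 0
    for prompt, relevant_doc, ablated_doc, target in examples:
        if score_fn(relevant_doc, prompt, target) > score_fn(ablated_doc, prompt, target):
            wins += 1
    return wins / len(examples)

# One hypothetical example pair: the same prompt and target continuation,
# scored against a relevant and a less relevant grounding document.
examples = [
    ("The Eiffel Tower",                      # prompt
     "The Eiffel Tower is 330 metres tall.",  # relevant grounding
     "The Louvre is a museum in Paris.",      # ablated grounding
     "is 330 metres tall."),                  # target continuation
]
print(factual_ablation_accuracy(examples))  # → 1.0
```

With a real model, `score_fn` would sum token log-probabilities of the target conditioned on the prompt concatenated with each grounding document; the accuracy over an evaluation set then measures factual consistency automatically, with no human judgments.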
Anthology ID:
2022.findings-acl.294
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venues:
ACL | Findings
Publisher:
Association for Computational Linguistics
Pages:
3732–3746
URL:
https://aclanthology.org/2022.findings-acl.294
DOI:
10.18653/v1/2022.findings-acl.294
Cite (ACL):
Peter West, Chris Quirk, Michel Galley, and Yejin Choi. 2022. Probing Factually Grounded Content Transfer with Factual Ablation. In Findings of the Association for Computational Linguistics: ACL 2022, pages 3732–3746, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Probing Factually Grounded Content Transfer with Factual Ablation (West et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.294.pdf