Text Fact Transfer

Nishant Balepur, Jie Huang, Kevin Chang


Abstract
Text style transfer is a prominent task that aims to control the style of text without inherently changing its factual content. To cover more text modification applications, such as adapting past news for current events and repurposing educational materials, we propose the task of text fact transfer, which seeks to transfer the factual content of a source text between topics without modifying its style. We find that existing language models struggle with text fact transfer, due to their inability to preserve the specificity and phrasing of the source text, and tendency to hallucinate errors. To address these issues, we design ModQGA, a framework that minimally modifies a source text with a novel combination of end-to-end question generation and specificity-aware question answering. Through experiments on four existing datasets adapted for text fact transfer, we show that ModQGA can accurately transfer factual content without sacrificing the style of the source text.
Anthology ID: 2023.emnlp-main.288
Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 4745–4764
URL: https://aclanthology.org/2023.emnlp-main.288
DOI: 10.18653/v1/2023.emnlp-main.288
Cite (ACL):
Nishant Balepur, Jie Huang, and Kevin Chang. 2023. Text Fact Transfer. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 4745–4764, Singapore. Association for Computational Linguistics.
Cite (Informal):
Text Fact Transfer (Balepur et al., EMNLP 2023)
PDF: https://aclanthology.org/2023.emnlp-main.288.pdf
Video: https://aclanthology.org/2023.emnlp-main.288.mp4