Dense Paraphrasing for Textual Enrichment

Jingxuan Tu, Kyeongmin Rim, Eben Holderness, Bingyang Ye, James Pustejovsky


Abstract
Understanding inferences from text requires more than merely recovering surface arguments, adjuncts, or strings associated with the query terms. As humans, we interpret sentences as contextualized components of a narrative or discourse, both by filling in missing information and by reasoning about event consequences. In this paper, we define Dense Paraphrasing (DP) as the process of rewriting a textual expression (lexeme or phrase) so that it reduces ambiguity while also making explicit the underlying semantics that is not (necessarily) expressed in the economy of sentence structure. We apply DP techniques to English procedural texts from the cooking recipe domain, and describe the scope and design of the application, which involves creating a graph representation of events and generating hidden arguments through paraphrasing. We provide insights into how this DP process can enrich a source text by showing that the dense-paraphrased event graph is a good resource for large language models (LLMs) such as GPT-3 to generate reliable paraphrases, and by experimenting with baselines for automatic DP generation. Finally, we demonstrate the utility of the dataset and event graph structure through a case study on out-of-domain modeling and on different DP prompts and GPT models for paraphrasing.
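The abstract describes building an event graph over a recipe and recovering hidden arguments that the surface text elides. A minimal toy sketch of that idea follows; the class names, field layout, and paraphrase template here are illustrative assumptions, not the paper's actual schema or annotation format.

```python
# Illustrative sketch (hypothetical schema, not the paper's): each event node
# records its verb, the arguments explicit in the text, and recovered hidden
# arguments; edges link an event's output to the event that consumes it.
from dataclasses import dataclass, field


@dataclass
class Event:
    verb: str
    args: list                                        # arguments explicit in the text
    hidden_args: list = field(default_factory=list)   # recovered implicit arguments

    def dense_paraphrase(self) -> str:
        # Rewrite the event so both explicit and hidden arguments are overt.
        all_args = self.args + self.hidden_args
        return f"{self.verb} {' and '.join(all_args)}"


# "Chop the onion. Add to the pan." -- "add" omits its object,
# which is the (transformed) output of the previous event.
chop = Event("chop", ["the onion"])
add = Event("add", [], hidden_args=["the chopped onion"])
edges = [(chop, add)]  # the output of `chop` feeds `add`

print(add.dense_paraphrase())  # -> "add the chopped onion"
```

The dense-paraphrased string makes the elided object explicit, which is the kind of enriched text the paper feeds to LLMs for paraphrase generation.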
Anthology ID:
2023.iwcs-1.4
Volume:
Proceedings of the 15th International Conference on Computational Semantics
Month:
June
Year:
2023
Address:
Nancy, France
Editors:
Maxime Amblard, Ellen Breitholtz
Venue:
IWCS
SIG:
SIGSEM
Publisher:
Association for Computational Linguistics
Pages:
39–49
URL:
https://aclanthology.org/2023.iwcs-1.4
Cite (ACL):
Jingxuan Tu, Kyeongmin Rim, Eben Holderness, Bingyang Ye, and James Pustejovsky. 2023. Dense Paraphrasing for Textual Enrichment. In Proceedings of the 15th International Conference on Computational Semantics, pages 39–49, Nancy, France. Association for Computational Linguistics.
Cite (Informal):
Dense Paraphrasing for Textual Enrichment (Tu et al., IWCS 2023)
PDF:
https://aclanthology.org/2023.iwcs-1.4.pdf