Rationale-based Opinion Summarization

Haoyuan Li, Snigdha Chaturvedi


Abstract
Opinion summarization aims to generate concise summaries that present the popular opinions expressed in a large group of reviews. However, such summaries can be too generic and lack supporting details. To address these issues, we propose a new paradigm for summarizing reviews, rationale-based opinion summarization. Rationale-based opinion summaries output representative opinions as well as one or more corresponding rationales. To extract good rationales, we define four desirable properties: relatedness, specificity, popularity, and diversity, and present a Gibbs-sampling-based method to extract rationales. Overall, we propose RATION, an unsupervised extractive system with two components: an Opinion Extractor (to extract representative opinions) and a Rationale Extractor (to extract corresponding rationales). We conduct automatic and human evaluations to show that rationales extracted by RATION have the proposed properties and that its summaries are more useful than conventional summaries. The implementation of our work is available at https://github.com/leehaoyuan/RATION.
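The abstract mentions a Gibbs-sampling-based method for selecting rationales that balance relatedness, specificity, popularity, and diversity. As a rough illustration only, and not the paper's actual implementation (see the linked repository for that), the sketch below shows how Gibbs sampling can select a fixed-size subset of candidate sentences given hypothetical per-sentence scores and a pairwise diversity term; the names gibbs_select, energy, score, and pair_div are all assumptions made for this example.

```python
import math
import random

def energy(subset, score, pair_div):
    """Joint quality of a rationale set: per-sentence scores (standing in
    for relatedness + specificity + popularity) plus pairwise diversity."""
    e = sum(score[i] for i in subset)
    e += sum(pair_div[i][j] for i in subset for j in subset if i < j)
    return e

def gibbs_select(n, score, pair_div, k=3, iters=500, temp=0.5, seed=0):
    """Gibbs sampling over k-subsets of n candidates: repeatedly resample
    one slot, holding the others fixed, with probability proportional to
    exp(energy / temp)."""
    rng = random.Random(seed)
    state = rng.sample(range(n), k)
    for _ in range(iters):
        slot = rng.randrange(k)
        others = [c for i, c in enumerate(state) if i != slot]
        choices = [c for c in range(n) if c not in others]
        weights = [math.exp(energy(others + [c], score, pair_div) / temp)
                   for c in choices]
        state = others + [rng.choices(choices, weights=weights)[0]]
    return sorted(state)

# Toy usage: 5 candidate sentences, select 2.
score = [0.9, 0.7, 0.8, 0.2, 0.5]        # per-sentence quality scores
pair_div = [[0.0] * 5 for _ in range(5)]
pair_div[0][2] = pair_div[2][0] = 0.6    # sentences 0 and 2 cover different aspects
print(gibbs_select(5, score, pair_div, k=2))
```

Sampling (rather than greedy selection) lets the extractor escape locally good but globally redundant sets, which is one common motivation for Gibbs-style subset selection.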
Anthology ID:
2024.naacl-long.458
Volume:
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
8267–8285
URL:
https://aclanthology.org/2024.naacl-long.458
Cite (ACL):
Haoyuan Li and Snigdha Chaturvedi. 2024. Rationale-based Opinion Summarization. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 8267–8285, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Rationale-based Opinion Summarization (Li & Chaturvedi, NAACL 2024)
PDF:
https://aclanthology.org/2024.naacl-long.458.pdf
Copyright:
2024.naacl-long.458.copyright.pdf