Noisy Pairing and Partial Supervision for Stylized Opinion Summarization

Hayate Iso, Xiaolan Wang, Yoshi Suhara
Abstract
Opinion summarization research has primarily focused on generating summaries reflecting important opinions from customer reviews without paying much attention to the writing style. In this paper, we propose the stylized opinion summarization task, which aims to generate a summary of customer reviews in the desired (e.g., professional) writing style. To tackle the difficulty in collecting customer and professional review pairs, we develop a non-parallel training framework, Noisy Pairing and Partial Supervision (NAPA), which trains a stylized opinion summarization system from non-parallel customer and professional review sets. We create a benchmark ProSum by collecting customer and professional reviews from Yelp and Michelin. Experimental results on ProSum and FewSum demonstrate that our non-parallel training framework consistently improves both automatic and human evaluations, successfully building a stylized opinion summarization model that can generate professionally-written summaries from customer reviews. The code is available at https://github.com/megagonlabs/napa
Anthology ID:
2024.inlg-main.2
Volume:
Proceedings of the 17th International Natural Language Generation Conference
Month:
September
Year:
2024
Address:
Tokyo, Japan
Editors:
Saad Mahamood, Nguyen Le Minh, Daphne Ippolito
Venue:
INLG
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
13–23
URL:
https://aclanthology.org/2024.inlg-main.2
Cite (ACL):
Hayate Iso, Xiaolan Wang, and Yoshi Suhara. 2024. Noisy Pairing and Partial Supervision for Stylized Opinion Summarization. In Proceedings of the 17th International Natural Language Generation Conference, pages 13–23, Tokyo, Japan. Association for Computational Linguistics.
Cite (Informal):
Noisy Pairing and Partial Supervision for Stylized Opinion Summarization (Iso et al., INLG 2024)
PDF:
https://aclanthology.org/2024.inlg-main.2.pdf
Supplementary attachment:
2024.inlg-main.2.Supplementary_Attachment.pdf