Attributable and Scalable Opinion Summarization

Tom Hosking, Hao Tang, Mirella Lapata


Abstract
We propose a method for unsupervised opinion summarization that encodes sentences from customer reviews into a hierarchical discrete latent space, then identifies common opinions based on the frequency of their encodings. We are able to generate both abstractive summaries by decoding these frequent encodings, and extractive summaries by selecting the sentences assigned to the same frequent encodings. Our method is attributable, because the model identifies sentences used to generate the summary as part of the summarization process. It scales easily to many hundreds of input reviews, because aggregation is performed in the latent space rather than over long sequences of tokens. We also demonstrate that our approach enables a degree of control, generating aspect-specific summaries by restricting the model to parts of the encoding space that correspond to desired aspects (e.g., location or food). Automatic and human evaluation on two datasets from different domains demonstrates that our method generates summaries that are more informative than prior work and better grounded in the input reviews.
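
The abstract describes a counting-based pipeline: quantize each sentence to a path of discrete codes, count how often each path occurs across the input reviews, and read an extractive summary off the most frequent paths (with aspect control by restricting which codes are eligible). The Python sketch below illustrates only that aggregation-and-selection logic; toy_encode is a hypothetical keyword-based stand-in for the paper's learned hierarchical quantizer, and nothing here is the authors' implementation.

    from collections import Counter

    def toy_encode(sentence: str) -> tuple[str, str]:
        """Hypothetical stand-in for the learned hierarchical quantizer:
        the real model maps a sentence to a path of discrete codes, coarse
        to fine. Here we fake a (coarse aspect, crude polarity) path."""
        s = sentence.lower()
        aspect = next((a for a in ("food", "location", "service") if a in s), "other")
        polarity = "pos" if any(w in s for w in ("great", "good", "love", "friendly")) else "neg"
        return (aspect, polarity)

    def extractive_summary(sentences: list[str], k: int = 3,
                           aspect: str | None = None) -> list[str]:
        # Encode every input sentence into the discrete latent space.
        paths = [(s, toy_encode(s)) for s in sentences]
        # Aspect control: keep only codes whose coarse level matches the
        # requested aspect (e.g., "food" or "location").
        if aspect is not None:
            paths = [(s, p) for s, p in paths if p[0] == aspect]
        # Aggregate over codes, not tokens: adding more reviews only
        # means more counting, which is why the approach scales.
        counts = Counter(p for _, p in paths)
        top = {p for p, _ in counts.most_common(k)}
        # Attribution comes for free: the sentences sharing a frequent
        # code are exactly the inputs that support that summary point.
        summary, seen = [], set()
        for s, p in paths:
            if p in top and p not in seen:
                summary.append(s)
                seen.add(p)
        return summary

    reviews = ["The food was great.", "Great food and friendly staff.",
               "The location is noisy at night.", "Loved the food here."]
    print(extractive_summary(reviews, k=2))
    # -> ['The food was great.', 'The location is noisy at night.']

In the paper itself, an abstractive summary would additionally decode the frequent codes back into text; the sketch stops at sentence selection because that is the part that follows directly from frequency counting.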
Anthology ID:
2023.acl-long.473
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
8488–8505
URL:
https://aclanthology.org/2023.acl-long.473
DOI:
10.18653/v1/2023.acl-long.473
Cite (ACL):
Tom Hosking, Hao Tang, and Mirella Lapata. 2023. Attributable and Scalable Opinion Summarization. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 8488–8505, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Attributable and Scalable Opinion Summarization (Hosking et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.473.pdf
Video:
https://aclanthology.org/2023.acl-long.473.mp4