Disentangling Text Representation With Counter-Template For Unsupervised Opinion Summarization

Yanyue Zhang, Deyu Zhou


Abstract
Approaches for unsupervised opinion summarization are generally based on reconstruction models and generate a summary by decoding the aggregated representation of the inputs. Recent work has shown that aggregating via simple averaging leads to vector degeneration, producing generic summaries. To tackle this challenge, some approaches select among the inputs before aggregating. However, we argue that such selection is too coarse, as not all information in each input is equally essential for the summary. For example, content information such as "great coffee maker, easy to set up" is more valuable than a pattern such as "this is a great product". Therefore, we propose a novel framework for unsupervised opinion summarization based on text representation disentanglement with a counter-template. Specifically, a disentangling module is added to the encoder-decoder architecture, which decouples the input text representation into two parts: content and pattern. To capture the pattern information, a counter-template is utilized as supervision, which is automatically generated based on contrastive learning. Experimental results on two benchmark datasets show that the proposed approach outperforms state-of-the-art baselines in both quality and stability.
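
To make the architecture in the abstract concrete, below is a minimal, hypothetical sketch (not the authors' released code) of the idea: an encoder representation is split by a disentangling module into a content part and a pattern part, only the content parts are aggregated for decoding, and the pattern part is supervised against a counter-template embedding with a contrastive-style (InfoNCE-like) loss. All module names, dimensions, and the exact loss form are assumptions for illustration; the paper's actual formulation may differ.

```python
# Hypothetical sketch of representation disentanglement with counter-template
# supervision; names, dimensions, and the loss form are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DisentanglingModule(nn.Module):
    """Splits a review representation z into content z_c and pattern z_p."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.to_content = nn.Linear(hidden_dim, hidden_dim)
        self.to_pattern = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, z: torch.Tensor):
        z_c = self.to_content(z)   # content part, used for aggregation
        z_p = self.to_pattern(z)   # pattern part, supervised by the counter-template
        return z_c, z_p


def pattern_contrastive_loss(z_p, counter_template, negatives, temperature=0.1):
    """InfoNCE-style loss pulling pattern vectors toward the counter-template
    embedding and pushing them away from negative embeddings.

    z_p:              (batch, dim) pattern representations
    counter_template: (dim,)       embedding of the counter-template
    negatives:        (num_neg, dim) embeddings treated as negatives
    """
    z_p = F.normalize(z_p, dim=-1)
    pos = F.normalize(counter_template, dim=-1)
    neg = F.normalize(negatives, dim=-1)

    pos_sim = z_p @ pos / temperature                    # (batch,)
    neg_sim = z_p @ neg.t() / temperature                # (batch, num_neg)
    logits = torch.cat([pos_sim.unsqueeze(1), neg_sim], dim=1)
    labels = torch.zeros(z_p.size(0), dtype=torch.long)  # positive is index 0
    return F.cross_entropy(logits, labels)


if __name__ == "__main__":
    hidden_dim, batch, num_neg = 768, 4, 8
    z = torch.randn(batch, hidden_dim)                   # encoder outputs, one per review
    module = DisentanglingModule(hidden_dim)
    z_c, z_p = module(z)

    # Aggregate only the content parts to form the representation fed to the decoder.
    summary_repr = z_c.mean(dim=0)

    counter_template = torch.randn(hidden_dim)           # stand-in for the learned counter-template
    negatives = torch.randn(num_neg, hidden_dim)
    loss = pattern_contrastive_loss(z_p, counter_template, negatives)
    print(summary_repr.shape, loss.item())
```

The intent of the sketch is the division of labor described in the abstract: generic "pattern" information is absorbed by the counter-template-supervised branch, so that averaging over the remaining content representations is less prone to producing a generic summary.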
Anthology ID:
2023.findings-acl.395
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
6344–6357
URL:
https://aclanthology.org/2023.findings-acl.395
DOI:
10.18653/v1/2023.findings-acl.395
Cite (ACL):
Yanyue Zhang and Deyu Zhou. 2023. Disentangling Text Representation With Counter-Template For Unsupervised Opinion Summarization. In Findings of the Association for Computational Linguistics: ACL 2023, pages 6344–6357, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Disentangling Text Representation With Counter-Template For Unsupervised Opinion Summarization (Zhang & Zhou, Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.395.pdf