Semantic-Preserving Abstractive Text Summarization with Siamese Generative Adversarial Net

Xin Sheng, Linli Xu, Yinlong Xu, Deqiang Jiang, Bo Ren


Abstract
We propose a novel siamese generative adversarial net for abstractive text summarization (SSPGAN), which can preserve the main semantics of the source text. Different from previous generative adversarial net based methods, SSPGAN is equipped with a siamese semantic-preserving discriminator, which can not only be trained to discriminate the machine-generated summaries from the human-summarized ones, but also ensure the semantic consistency between the source text and target summary. As a consequence of the min-max game between the generator and the siamese semantic-preserving discriminator, the generator can generate a summary that conveys the key content of the source text more accurately. Extensive experiments on several text summarization benchmarks in different languages demonstrate that the proposed model can achieve significant improvements over the state-of-the-art methods.
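The training objective the abstract describes can be sketched as a discriminator loss with two terms: a standard adversarial term (real vs. machine-generated summaries) and a semantic-consistency term tying the summary to the source. The sketch below is an illustrative assumption, not the authors' implementation: the function names, the cosine-similarity-based consistency term, the margin value, and the use of precomputed embeddings in place of the shared siamese encoder are all hypothetical.

```python
import numpy as np

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def discriminator_loss(src_emb, real_emb, fake_emb, real_score, fake_score,
                       margin=0.1):
    """Hypothetical siamese semantic-preserving discriminator loss.

    src_emb, real_emb, fake_emb: embeddings of the source text, the
    human-written summary, and the machine-generated summary (stand-ins
    for the shared siamese encoder's outputs).
    real_score, fake_score: the discriminator's real/fake probabilities.
    """
    # Adversarial term: standard GAN discriminator log-loss.
    adv = -np.log(real_score) - np.log(1.0 - fake_score)
    # Semantic-preserving term (margin ranking): the human summary should
    # be at least `margin` closer to the source than the generated one.
    sem = max(0.0, margin - (cosine(src_emb, real_emb)
                             - cosine(src_emb, fake_emb)))
    return adv + sem
```

In the min-max game, the generator would be rewarded both for fooling the adversarial term and for closing the semantic gap, pushing it toward summaries that carry the key content of the source.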
Anthology ID:
2022.findings-naacl.163
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2121–2132
URL:
https://aclanthology.org/2022.findings-naacl.163
DOI:
10.18653/v1/2022.findings-naacl.163
Cite (ACL):
Xin Sheng, Linli Xu, Yinlong Xu, Deqiang Jiang, and Bo Ren. 2022. Semantic-Preserving Abstractive Text Summarization with Siamese Generative Adversarial Net. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 2121–2132, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Semantic-Preserving Abstractive Text Summarization with Siamese Generative Adversarial Net (Sheng et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.163.pdf