Boosting Summarization with Normalizing Flows and Aggressive Training

Yu Yang, Xiaotong Shen

Abstract
This paper presents FlowSUM, a normalizing flows-based variational encoder-decoder framework for Transformer-based summarization. Our approach tackles two primary challenges in variational summarization: insufficient semantic information in latent representations and posterior collapse during training. To address these challenges, we employ normalizing flows to enable flexible latent posterior modeling, and we propose a controlled alternate aggressive training (CAAT) strategy with an improved gate mechanism. Experimental results show that FlowSUM significantly enhances the quality of generated summaries and unleashes the potential for knowledge distillation with minimal impact on inference time. Furthermore, we investigate the issue of posterior collapse in normalizing flows and analyze how the summary quality is affected by the training strategy, gate initialization, and the type and number of normalizing flows used, offering valuable insights for future research.
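The paper's flows are defined in its released implementation; purely as a rough illustration of the idea of flexible latent posterior modeling, the sketch below implements a single planar flow step (one common flow family), which transports a Gaussian latent sample while tracking the log-determinant term that a variational objective needs. The function name and parameters here are hypothetical, not the paper's API, and FlowSUM may use a different flow type.

```python
import numpy as np

def planar_flow(z, u, w, b):
    """One planar normalizing-flow step: z' = z + u * tanh(w.z + b).

    Returns the transformed samples and the per-sample log|det Jacobian|,
    which enters the variational bound. (A full implementation would also
    constrain u so that w.u >= -1, keeping the map invertible.)
    """
    a = np.tanh(z @ w + b)               # scalar activation per sample
    z_new = z + np.outer(a, u)           # shift each sample along direction u
    psi = np.outer(1.0 - a ** 2, w)      # gradient of tanh(w.z + b) w.r.t. z
    log_det = np.log(np.abs(1.0 + psi @ u))
    return z_new, log_det

# Toy usage: push 3 samples from a 4-d Gaussian base through one flow step.
rng = np.random.default_rng(0)
d = 4
z = rng.standard_normal((3, d))
u, w, b = rng.standard_normal(d), rng.standard_normal(d), 0.1
z_new, log_det = planar_flow(z, u, w, b)
```

Stacking several such steps yields a richer posterior than a plain Gaussian, which is the property the abstract appeals to.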
Anthology ID: 2023.emnlp-main.165
Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 2727–2751
URL: https://aclanthology.org/2023.emnlp-main.165
DOI: 10.18653/v1/2023.emnlp-main.165
Cite (ACL): Yu Yang and Xiaotong Shen. 2023. Boosting Summarization with Normalizing Flows and Aggressive Training. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 2727–2751, Singapore. Association for Computational Linguistics.
Cite (Informal): Boosting Summarization with Normalizing Flows and Aggressive Training (Yang & Shen, EMNLP 2023)
PDF: https://aclanthology.org/2023.emnlp-main.165.pdf
Video: https://aclanthology.org/2023.emnlp-main.165.mp4