SummaReranker: A Multi-Task Mixture-of-Experts Re-ranking Framework for Abstractive Summarization

Mathieu Ravaut, Shafiq Joty, Nancy Chen

Abstract
Sequence-to-sequence neural networks have recently achieved great success in abstractive summarization, especially through fine-tuning large pre-trained language models on the downstream dataset. These models are typically decoded with beam search to generate a unique summary. However, the search space is very large, and due to exposure bias, such decoding is not optimal. In this paper, we show that it is possible to directly train a second-stage model performing re-ranking on a set of summary candidates. Our mixture-of-experts SummaReranker learns to select a better candidate and consistently improves the performance of the base model. With a base PEGASUS, we push ROUGE scores by 5.44% on CNN-DailyMail (47.16 ROUGE-1), 1.31% on XSum (48.12 ROUGE-1) and 9.34% on Reddit TIFU (29.83 ROUGE-1), reaching a new state-of-the-art. Our code and checkpoints will be available at https://github.com/ntunlp/SummaReranker.
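The abstract describes a two-stage pipeline: a fine-tuned sequence-to-sequence model first decodes a pool of summary candidates with beam search, then a second-stage re-ranker scores each candidate and keeps the best one. Below is a minimal sketch of that pipeline using Hugging Face transformers; the public google/pegasus-cnn_dailymail checkpoint and the `scorer` callable are illustrative assumptions, not the paper's released models, and `scorer` merely stands in for the trained mixture-of-experts re-ranker.

```python
import torch
from transformers import PegasusTokenizer, PegasusForConditionalGeneration

device = "cuda" if torch.cuda.is_available() else "cpu"

# Stage-1 base model: a PEGASUS checkpoint fine-tuned on CNN-DailyMail
# (an assumption for illustration; any fine-tuned seq2seq model works).
model_name = "google/pegasus-cnn_dailymail"
tokenizer = PegasusTokenizer.from_pretrained(model_name)
model = PegasusForConditionalGeneration.from_pretrained(model_name).to(device)

def generate_candidates(source: str, num_candidates: int = 8) -> list[str]:
    """Stage 1: decode a pool of summary candidates with beam search."""
    inputs = tokenizer(source, truncation=True, return_tensors="pt").to(device)
    output_ids = model.generate(
        **inputs,
        num_beams=num_candidates,
        num_return_sequences=num_candidates,  # keep all beams, not just the top one
        max_length=128,
    )
    return tokenizer.batch_decode(output_ids, skip_special_tokens=True)

def rerank(source: str, candidates: list[str], scorer) -> str:
    """Stage 2: score each (source, candidate) pair and return the best.

    `scorer` is a hypothetical callable mapping (source, candidate) to a
    scalar; in SummaReranker this role is played by the trained multi-task
    mixture-of-experts re-ranker."""
    scores = [scorer(source, c) for c in candidates]
    best = max(range(len(candidates)), key=lambda i: scores[i])
    return candidates[best]
```

In the paper, the re-ranker is trained in a multi-task fashion with one expert per evaluation metric (e.g., ROUGE-1/2/L), and the expert predictions are aggregated into a single candidate score; any scorer with the (source, candidate) → scalar interface slots into `rerank` above.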
Anthology ID:
2022.acl-long.309
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
4504–4524
URL:
https://aclanthology.org/2022.acl-long.309
DOI:
10.18653/v1/2022.acl-long.309
Cite (ACL):
Mathieu Ravaut, Shafiq Joty, and Nancy Chen. 2022. SummaReranker: A Multi-Task Mixture-of-Experts Re-ranking Framework for Abstractive Summarization. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 4504–4524, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
SummaReranker: A Multi-Task Mixture-of-Experts Re-ranking Framework for Abstractive Summarization (Ravaut et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.309.pdf
Software:
 2022.acl-long.309.software.zip
Video:
 https://aclanthology.org/2022.acl-long.309.mp4
Code:
 ntunlp/summareranker
Data:
 CNN/Daily Mail, Reddit, Reddit TIFU, XSum