Investigating Efficiently Extending Transformers for Long Input Summarization

Jason Phang, Yao Zhao, Peter Liu


Abstract
While large pretrained Transformer models have proven highly capable at tackling natural language tasks, handling long sequence inputs still poses a significant challenge. One such task is long input summarization, where inputs are longer than the maximum input context of most models. Through an extensive set of experiments, we investigate what model architectural changes and pretraining paradigms most efficiently adapt a pretrained Transformer for long input summarization. We find that a staggered, block-local Transformer with global encoder tokens strikes a good balance of performance and efficiency, and that an additional pretraining phase on long sequences meaningfully improves downstream summarization performance. Based on our findings, we introduce PEGASUS-X, an extension of the PEGASUS model with additional long input pretraining to handle inputs of up to 16K tokens, which achieves strong performance on long input summarization tasks comparable with much larger models.
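To make the architectural idea in the abstract concrete, below is a minimal sketch (not the authors' code) of block-local encoder self-attention with global tokens, roughly the pattern PEGASUS-X uses. The block size, the number of global tokens, and the half-block shift used to "stagger" blocks across layers are illustrative assumptions.

```python
# Minimal sketch of block-local attention with global encoder tokens.
# All shapes and the staggering rule are assumptions for illustration.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def block_local_global_attention(x, g, block_size, stagger=False):
    """Single-head attention where each local token attends to its own block
    plus all global tokens, and global tokens attend to everything.

    x: (seq_len, d) local token representations
    g: (num_global, d) global token representations
    """
    seq_len, d = x.shape
    if stagger:
        # Staggered blocks: shift boundaries by half a block on alternate
        # layers so information can flow across block edges (an assumption
        # about what "staggered" means here).
        x = np.roll(x, block_size // 2, axis=0)

    # Pad so the sequence splits evenly into blocks.
    pad = (-seq_len) % block_size
    x_pad = np.concatenate([x, np.zeros((pad, d))], axis=0)
    blocks = x_pad.reshape(-1, block_size, d)          # (n_blocks, block, d)

    out_blocks = []
    for blk in blocks:
        # Keys/values for this block: its own tokens plus the global tokens.
        kv = np.concatenate([blk, g], axis=0)          # (block + n_global, d)
        scores = blk @ kv.T / np.sqrt(d)               # (block, block + n_global)
        out_blocks.append(softmax(scores) @ kv)
    out = np.concatenate(out_blocks, axis=0)[:seq_len]

    # Global tokens attend to all local tokens and to each other.
    kv_all = np.concatenate([x[:seq_len], g], axis=0)
    g_scores = g @ kv_all.T / np.sqrt(d)
    g_out = softmax(g_scores) @ kv_all

    if stagger:
        out = np.roll(out, -(block_size // 2), axis=0)
    return out, g_out

# Toy usage: 16 local tokens, 2 global tokens, block size 4.
x = np.random.randn(16, 8)
g = np.random.randn(2, 8)
local_out, global_out = block_local_global_attention(x, g, block_size=4, stagger=True)
print(local_out.shape, global_out.shape)  # (16, 8) (2, 8)
```

Because each local token only attends within its block (plus a fixed number of global tokens), attention cost grows linearly with sequence length rather than quadratically, which is what makes 16K-token inputs practical.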
Anthology ID: 2023.emnlp-main.240
Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 3946–3961
URL: https://aclanthology.org/2023.emnlp-main.240
DOI: 10.18653/v1/2023.emnlp-main.240
Cite (ACL): Jason Phang, Yao Zhao, and Peter Liu. 2023. Investigating Efficiently Extending Transformers for Long Input Summarization. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 3946–3961, Singapore. Association for Computational Linguistics.
Cite (Informal): Investigating Efficiently Extending Transformers for Long Input Summarization (Phang et al., EMNLP 2023)
PDF: https://aclanthology.org/2023.emnlp-main.240.pdf
Video: https://aclanthology.org/2023.emnlp-main.240.mp4