Subformer: Exploring Weight Sharing for Parameter Efficiency in Generative Transformers

Machel Reid, Edison Marrese-Taylor, Yutaka Matsuo


Abstract
Transformers have shown improved performance when compared to previous architectures for sequence processing such as RNNs. Despite their sizeable performance gains, as recently suggested, the model is computationally expensive to train and has a high parameter budget. In light of this, we explore parameter-sharing methods in Transformers with a specific focus on generative models. We analyze different parameter sharing/reduction methods and develop the Subformer. Our model combines sandwich-style parameter sharing, which overcomes the shortcomings of naive cross-layer parameter sharing in generative models, with self-attentive embedding factorization (SAFE). Experiments on machine translation, abstractive summarization, and language modeling show that the Subformer can outperform the Transformer even when using significantly fewer parameters.
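
To make the abstract's two techniques concrete, here is a minimal PyTorch sketch based only on the high-level description above, not on the authors' implementation (see the machelreid/subformer repository for that); all class names, layer counts, and dimensions are illustrative assumptions.

import torch
import torch.nn as nn

class SandwichSharedEncoder(nn.Module):
    """Sandwich-style parameter sharing (sketch): unique first and last
    layers, with one shared set of weights reused by every middle layer."""
    def __init__(self, d_model=512, nhead=8, num_layers=6):
        super().__init__()
        self.first = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.shared = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.last = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.num_middle = num_layers - 2

    def forward(self, x):
        x = self.first(x)
        for _ in range(self.num_middle):  # same weights applied repeatedly
            x = self.shared(x)
        return self.last(x)

class SAFEEmbedding(nn.Module):
    """Self-attentive embedding factorization (SAFE), as described at a
    high level in the abstract: a small embedding table is projected up
    to the model dimension and refined by a small self-attention block."""
    def __init__(self, vocab_size=32000, d_small=128, d_model=512, nhead=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_small)  # factorized table
        self.up = nn.Linear(d_small, d_model)           # up-projection
        self.attn = nn.MultiheadAttention(d_model, nhead, batch_first=True)

    def forward(self, tokens):
        h = self.up(self.embed(tokens))
        out, _ = self.attn(h, h, h)  # self-attention over the sequence
        return out

# Tiny smoke test.
if __name__ == "__main__":
    emb = SAFEEmbedding()
    enc = SandwichSharedEncoder()
    toks = torch.randint(0, 32000, (2, 10))
    print(enc(emb(toks)).shape)  # torch.Size([2, 10, 512])

Under these assumptions, the parameter savings are easy to see: a 6-layer sandwich stack pays for only 3 layers' worth of weights, and the factorized embedding shrinks the vocabulary table from vocab_size x d_model to vocab_size x d_small.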
Anthology ID:
2021.findings-emnlp.344
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
4081–4090
URL:
https://aclanthology.org/2021.findings-emnlp.344
DOI:
10.18653/v1/2021.findings-emnlp.344
Cite (ACL):
Machel Reid, Edison Marrese-Taylor, and Yutaka Matsuo. 2021. Subformer: Exploring Weight Sharing for Parameter Efficiency in Generative Transformers. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 4081–4090, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Subformer: Exploring Weight Sharing for Parameter Efficiency in Generative Transformers (Reid et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.344.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.344.mp4
Code:
machelreid/subformer
Data:
CNN/Daily Mail, WikiText-103, WikiText-2