PartialFormer: Modeling Part Instead of Whole for Machine Translation

Tong Zheng, Bei Li, Huiwen Bao, Jiale Wang, Weiqiao Shan, Tong Xiao, JingBo Zhu

Abstract
The design choices in Transformer feed-forward networks (FFNs) incur significant computational and parameter overhead. In this work, we emphasize the importance of the hidden dimension in designing lightweight FFNs, a factor often overlooked in previous architectures. Guided by this principle, we introduce PartialFormer, a parameter-efficient Transformer architecture that uses multiple smaller FFNs to reduce parameters and computation while preserving essential hidden dimensions. These smaller FFNs are integrated into a multi-head attention mechanism for effective collaboration. We also propose a tailored head scaling strategy to enhance PartialFormer's capabilities. Furthermore, we present a residual-like attention calculation to improve depth scaling within PartialFormer. Extensive experiments on 9 translation tasks and 1 abstractive summarization task validate the effectiveness of PartialFormer. Our code is available at: https://github.com/zhengkid/PartialFormer.
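To make the "multiple smaller FFNs" idea from the abstract concrete, below is a minimal PyTorch sketch: each attention head's slice of the model dimension is processed by its own small FFN that keeps a comparatively large hidden dimension. The class name PartialFFNHeads, the argument names, and the dimensions are illustrative assumptions, not the paper's actual implementation (see the linked repository for that).

import torch
import torch.nn as nn

class PartialFFNHeads(nn.Module):
    """Hypothetical sketch of per-head FFNs (not the paper's code).

    Instead of one FFN over the full model dimension d_model, each of
    the n_heads slices (d_model // n_heads wide) passes through its own
    small FFN whose hidden dimension d_hidden stays large relative to
    the slice width.
    """

    def __init__(self, d_model: int, n_heads: int, d_hidden: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.d_head = d_model // n_heads
        # One lightweight FFN per head: parameters scale with
        # n_heads * 2 * d_head * d_hidden rather than 2 * d_model * d_ffn.
        self.ffns = nn.ModuleList(
            nn.Sequential(
                nn.Linear(self.d_head, d_hidden),
                nn.ReLU(),
                nn.Linear(d_hidden, self.d_head),
            )
            for _ in range(n_heads)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); split into per-head slices,
        # apply each head's FFN, then concatenate back to d_model.
        parts = x.split(self.d_head, dim=-1)
        out = [ffn(p) for ffn, p in zip(self.ffns, parts)]
        return torch.cat(out, dim=-1)

# Example: 8 heads of width 64 (d_model = 512), each with a 1024-wide hidden layer.
layer = PartialFFNHeads(d_model=512, n_heads=8, d_hidden=1024)
y = layer(torch.randn(2, 10, 512))
print(y.shape)  # torch.Size([2, 10, 512])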
Anthology ID:
2024.findings-acl.434
Volume:
Findings of the Association for Computational Linguistics ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand and virtual meeting
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
7280–7294
URL:
https://aclanthology.org/2024.findings-acl.434
Cite (ACL):
Tong Zheng, Bei Li, Huiwen Bao, Jiale Wang, Weiqiao Shan, Tong Xiao, and JingBo Zhu. 2024. PartialFormer: Modeling Part Instead of Whole for Machine Translation. In Findings of the Association for Computational Linguistics ACL 2024, pages 7280–7294, Bangkok, Thailand and virtual meeting. Association for Computational Linguistics.
Cite (Informal):
PartialFormer: Modeling Part Instead of Whole for Machine Translation (Zheng et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.434.pdf