LayerConnect: Hypernetwork-Assisted Inter-Layer Connector to Enhance Parameter Efficiency

Haoxiang Shi, Rongsheng Zhang, Jiaan Wang, Cen Wang, Yinhe Zheng, Tetsuya Sakai


Abstract
Pre-trained Language Models (PLMs) are a cornerstone of modern Natural Language Processing (NLP). However, as PLMs grow larger, fine-tuning all of their parameters becomes increasingly inefficient. Existing parameter-efficient methods generally focus on reducing the number of trainable parameters in PLMs but neglect inference speed, which limits the practical deployment of PLMs. In this paper, we propose LayerConnect (hypernetwork-assisted inter-layer connectors) to enhance inference efficiency. Specifically, a lightweight connector with a linear structure is inserted between two Transformer layers, and the parameters inside each connector are tuned by a hypernetwork comprising an interpolator and a down-sampler. We perform extensive experiments on the widely used GLUE benchmark. The experimental results verify the inference efficiency of our model: compared to Adapter, our model reduces the parameters to approximately 11.75%, while keeping the performance degradation below 5% (2.5 points on average).
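The abstract's core idea (a linear bottleneck connector between Transformer layers whose weights are produced by a small hypernetwork) can be sketched as follows. This is a minimal illustration, not the authors' exact design: the class name, the dimensions, and the "down-sampler generates the down-projection, interpolator generates the up-projection" split are all assumptions made for clarity.

```python
import torch
import torch.nn as nn

class HyperConnector(nn.Module):
    """Sketch of a LayerConnect-style inter-layer connector.

    A hypernetwork maps a small learned layer embedding to the weights of
    a linear bottleneck connector inserted between two Transformer layers.
    All names and dimensions here are illustrative assumptions.
    """

    def __init__(self, hidden: int = 768, bottleneck: int = 16, emb: int = 8):
        super().__init__()
        # per-connector layer embedding fed to the hypernetwork
        self.layer_emb = nn.Parameter(torch.randn(emb))
        # hypernetwork heads generating the connector's weight matrices
        self.down_gen = nn.Linear(emb, hidden * bottleneck)  # down-projection weights
        self.up_gen = nn.Linear(emb, bottleneck * hidden)    # up-projection weights
        self.hidden, self.bottleneck = hidden, bottleneck

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # generate the connector's linear weights from the layer embedding
        w_down = self.down_gen(self.layer_emb).view(self.hidden, self.bottleneck)
        w_up = self.up_gen(self.layer_emb).view(self.bottleneck, self.hidden)
        # linear bottleneck with a residual connection around it
        return h + torch.relu(h @ w_down) @ w_up

x = torch.randn(2, 10, 768)   # (batch, seq_len, hidden)
out = HyperConnector()(x)
print(out.shape)              # torch.Size([2, 10, 768])
```

Because the connector weights are generated from a small embedding rather than stored directly, the trainable parameter count can stay far below that of a standard per-layer Adapter.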
Anthology ID:
2022.coling-1.276
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
3120–3126
URL:
https://aclanthology.org/2022.coling-1.276
Cite (ACL):
Haoxiang Shi, Rongsheng Zhang, Jiaan Wang, Cen Wang, Yinhe Zheng, and Tetsuya Sakai. 2022. LayerConnect: Hypernetwork-Assisted Inter-Layer Connector to Enhance Parameter Efficiency. In Proceedings of the 29th International Conference on Computational Linguistics, pages 3120–3126, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
LayerConnect: Hypernetwork-Assisted Inter-Layer Connector to Enhance Parameter Efficiency (Shi et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.276.pdf
Data
GLUE, QNLI