Continual Sequence Generation with Adaptive Compositional Modules

Yanzhe Zhang, Xuezhi Wang, Diyi Yang


Abstract
Continual learning is essential for real-world deployment when models must quickly adapt to new tasks without forgetting knowledge of old tasks. Existing work on continual sequence generation either always reuses existing parameters to learn new tasks, which is vulnerable to catastrophic forgetting on dissimilar tasks, or blindly adds new parameters for every new task, which can prevent knowledge sharing between similar tasks. To get the best of both worlds, we propose continual sequence generation with adaptive compositional modules, which adaptively adds modules to transformer architectures and composes both old and new modules for new tasks. We also incorporate pseudo experience replay to facilitate knowledge transfer in the shared modules. Experimental results on various sequences of generation tasks show that our framework can adaptively add or reuse modules based on task similarity, outperforming state-of-the-art baselines in both performance and parameter efficiency. Our code is publicly available at https://github.com/GT-SALT/Adaptive-Compositional-Modules.
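The core idea can be pictured as follows: each transformer layer holds a growing pool of adapter modules, and when a new task arrives, a fresh candidate module is added alongside the existing ones; learned mixing weights then decide whether to reuse an old module or keep the new one. Below is a minimal PyTorch sketch of this idea. The class names, the bottleneck-adapter design, and the softmax mixing parameterization are illustrative assumptions, not the authors' exact implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn


class Adapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project, residual."""

    def __init__(self, hidden_size: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(torch.relu(self.down(x)))


class AdaptiveCompositionLayer(nn.Module):
    """One per transformer layer: a growing pool of adapters plus learnable
    mixing logits. For a new task, a fresh candidate adapter is appended and
    the logits are trained to decide whether to reuse an old module or keep
    the new one (hypothetical parameterization of the decision stage)."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.hidden_size = hidden_size
        self.adapters = nn.ModuleList([Adapter(hidden_size)])
        self.logits = nn.Parameter(torch.zeros(1))

    def add_candidate(self) -> None:
        """Add a fresh adapter for a new task and extend the mixing logits."""
        self.adapters.append(Adapter(self.hidden_size))
        old = self.logits.data
        self.logits = nn.Parameter(torch.cat([old, torch.zeros(1)]))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Soft composition over all candidate modules; after the decision
        # stage, one would keep only the highest-weight module(s) and train
        # them on the new task.
        weights = torch.softmax(self.logits, dim=0)
        return sum(w * adapter(x) for w, adapter in zip(weights, self.adapters))


# Illustrative usage: a new task arrives, a candidate is added, and the layer
# mixes old and new adapters until the reuse-vs-add decision is made.
layer = AdaptiveCompositionLayer(hidden_size=768)
layer.add_candidate()
h = torch.randn(2, 16, 768)  # (batch, seq_len, hidden)
out = layer(h)
```

In this sketch, the pseudo experience replay mentioned in the abstract would come in whenever an old module ends up shared with a new task: replayed examples for earlier tasks are mixed into training so that updating the shared module does not erase old-task knowledge.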
Anthology ID:
2022.acl-long.255
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3653–3667
URL:
https://aclanthology.org/2022.acl-long.255
DOI:
10.18653/v1/2022.acl-long.255
Cite (ACL):
Yanzhe Zhang, Xuezhi Wang, and Diyi Yang. 2022. Continual Sequence Generation with Adaptive Compositional Modules. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 3653–3667, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Continual Sequence Generation with Adaptive Compositional Modules (Zhang et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.255.pdf
Software:
2022.acl-long.255.software.zip
Video:
https://aclanthology.org/2022.acl-long.255.mp4
Code:
GT-SALT/Adaptive-Compositional-Modules + additional community code
Data:
MultiWOZ, WikiSQL