Large-scale Lifelong Learning of In-context Instructions and How to Tackle It

Jisoo Mok, Jaeyoung Do, Sungjin Lee, Tara Taghavi, Seunghak Yu, Sungroh Yoon
Abstract
Jointly fine-tuning a Pre-trained Language Model (PLM) on a pre-defined set of tasks with in-context instructions has been shown to improve its generalization performance, allowing us to build a universal language model that can be deployed across task boundaries. In this work, we explore for the first time whether this attractive property of in-context instruction learning can be extended to a scenario in which tasks are fed to the target PLM sequentially. The primary objective of so-called lifelong in-context instruction learning is to improve the target PLM's instance- and task-level generalization performance as it observes more tasks. DynaInst, the proposed method for lifelong in-context instruction learning, achieves noticeable improvements in both types of generalization, nearly reaching the upper-bound performance obtained through joint training.
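The sketch below illustrates the lifelong (sequential) instruction-tuning setting described in the abstract, in contrast to joint training on all tasks at once. It is not the paper's DynaInst method; the model name, task stream, and prompt formatting are illustrative assumptions, using the Hugging Face Transformers API.

```python
# Minimal sketch of sequential instruction tuning (not DynaInst itself):
# a pre-trained seq2seq model is fine-tuned on a stream of
# instruction-formatted tasks, one task at a time.
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "t5-small"  # assumption: any seq2seq PLM serves for illustration
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# Hypothetical task stream: each task pairs an in-context instruction
# with a few (input, output) training instances.
task_stream = [
    ("Classify the sentiment of the sentence as positive or negative.",
     [("I loved this movie.", "positive"), ("The plot was dull.", "negative")]),
    ("Answer the question with a single word.",
     [("What is the capital of France?", "Paris")]),
]

model.train()
for instruction, examples in task_stream:  # tasks arrive sequentially, not jointly
    for source, target in examples:
        prompt = f"{instruction}\nInput: {source}\nOutput:"
        enc = tokenizer(prompt, return_tensors="pt", truncation=True)
        labels = tokenizer(target, return_tensors="pt").input_ids
        loss = model(**enc, labels=labels).loss  # standard fine-tuning loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```

After the stream is consumed, instance- and task-level generalization would be measured on held-out instances of seen tasks and on entirely unseen tasks, respectively; joint training on all tasks serves as the upper bound the abstract refers to.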
Anthology ID:
2023.acl-long.703
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
12573–12589
URL:
https://aclanthology.org/2023.acl-long.703
DOI:
10.18653/v1/2023.acl-long.703
Cite (ACL):
Jisoo Mok, Jaeyoung Do, Sungjin Lee, Tara Taghavi, Seunghak Yu, and Sungroh Yoon. 2023. Large-scale Lifelong Learning of In-context Instructions and How to Tackle It. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 12573–12589, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Large-scale Lifelong Learning of In-context Instructions and How to Tackle It (Mok et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.703.pdf
Video:
https://aclanthology.org/2023.acl-long.703.mp4