Bridging the Gap between Pre-Training and Fine-Tuning for Commonsense Generation

Haoran Yang, Yan Wang, Piji Li, Wei Bi, Wai Lam, Chen Xu


Abstract
Commonsense generation aims to generate a plausible sentence that contains all of the given, unordered concept words. Previous methods for this task usually concatenate these words directly as the input to a pre-trained language model (PLM). However, during PLM pre-training, the inputs are typically corrupted sentences whose word order is correct. This discrepancy between the input distributions of pre-training and fine-tuning makes it difficult for the model to fully exploit the knowledge in PLMs. In this paper, we propose a two-stage framework to alleviate this issue. First, in the pre-training stage, we design a new input format that endows PLMs with the ability to handle masked sentences with incorrect word order. Second, during fine-tuning, we insert the special token [MASK] between every two consecutive concept words, making the fine-tuning input distribution more similar to that of pre-training. We conduct extensive experiments and provide a thorough analysis to demonstrate the effectiveness of our proposed method.
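To make the fine-tuning input format concrete, the following is a minimal sketch (not the authors' released code) of how concept words might be interleaved with the mask token before being passed to a PLM. The checkpoint ("facebook/bart-base"), the example concept set, and the helper name build_masked_input are illustrative assumptions, not details from the paper.

```
# Sketch of the fine-tuning input construction described in the abstract:
# insert the PLM's mask token between consecutive concept words so the
# concept sequence resembles a masked, pre-training-style sentence.
from transformers import AutoTokenizer

# Assumed PLM; the paper does not mandate this specific checkpoint.
tokenizer = AutoTokenizer.from_pretrained("facebook/bart-base")
mask = tokenizer.mask_token  # "<mask>" for BART-style models, "[MASK]" for BERT-style

def build_masked_input(concepts):
    """Interleave the mask token between consecutive concept words."""
    return f" {mask} ".join(concepts)

concepts = ["dog", "frisbee", "catch", "throw"]   # hypothetical concept set
plain_input = " ".join(concepts)                  # prior work: plain concatenation
masked_input = build_masked_input(concepts)       # this paper: [MASK] between concepts

print(plain_input)   # dog frisbee catch throw
print(masked_input)  # dog <mask> frisbee <mask> catch <mask> throw

# The masked input is then tokenized and fed to the PLM as usual.
encoded = tokenizer(masked_input)
```

The intent of the inserted mask tokens is that the fine-tuning inputs look like the corrupted sentences seen during pre-training, rather than a bare list of concepts.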
Anthology ID:
2023.findings-eacl.28
Volume:
Findings of the Association for Computational Linguistics: EACL 2023
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
376–383
URL:
https://aclanthology.org/2023.findings-eacl.28
DOI:
10.18653/v1/2023.findings-eacl.28
Cite (ACL):
Haoran Yang, Yan Wang, Piji Li, Wei Bi, Wai Lam, and Chen Xu. 2023. Bridging the Gap between Pre-Training and Fine-Tuning for Commonsense Generation. In Findings of the Association for Computational Linguistics: EACL 2023, pages 376–383, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Bridging the Gap between Pre-Training and Fine-Tuning for Commonsense Generation (Yang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-eacl.28.pdf
Video:
https://aclanthology.org/2023.findings-eacl.28.mp4