When does Further Pre-training MLM Help? An Empirical Study on Task-Oriented Dialog Pre-training

Qi Zhu, Yuxian Gu, Lingxiao Luo, Bing Li, Cheng Li, Wei Peng, Minlie Huang, Xiaoyan Zhu


Abstract
Further pre-training language models on in-domain data (domain-adaptive pre-training, DAPT) or task-relevant data (task-adaptive pre-training, TAPT) before fine-tuning has been shown to improve downstream task performance. However, in task-oriented dialog modeling, we observe that further pre-training with the masked language modeling (MLM) objective does not always boost performance on a downstream task. We find that DAPT is beneficial in the low-resource setting, but as the fine-tuning data size grows, DAPT becomes less beneficial or even useless, and scaling up the DAPT data does not help. Through Representational Similarity Analysis, we conclude that more fine-tuning data yields a greater change in the model's representations and thus reduces the influence of the initialization.
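The paper's actual training and analysis code is in the linked repository (zqwerty/toddapt); the two sketches below only illustrate the ingredients the abstract names, under assumptions stated in the comments. First, a minimal sketch of further MLM pre-training (DAPT-style) using Hugging Face Transformers; the base checkpoint, corpus path, and hyperparameters are placeholders, not the paper's settings.

# Minimal sketch of further MLM pre-training on an in-domain corpus (DAPT).
# "dialog_corpus.txt" and all hyperparameters are illustrative placeholders.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# One dialog turn (or utterance) per line in a plain-text file.
dataset = load_dataset("text", data_files={"train": "dialog_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# Standard MLM objective: randomly mask 15% of tokens and predict them.
collator = DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="dapt-ckpt", num_train_epochs=1),
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()  # the resulting checkpoint is then fine-tuned on the task

Second, a generic way to compute a Representational Similarity Analysis score between two models' representations of the same inputs (e.g., before and after fine-tuning); this is the standard RSA recipe, not necessarily the paper's exact variant.

# Generic RSA: correlate the representational geometry of two models.
from scipy.spatial.distance import pdist
from scipy.stats import pearsonr

def rsa_score(reps_a, reps_b):
    """reps_a, reps_b: (n_examples, hidden_dim) arrays holding each
    model's representations of the *same* n_examples inputs."""
    # Representational dissimilarity: pairwise cosine distances
    # between the representations of all example pairs.
    rdm_a = pdist(reps_a, metric="cosine")
    rdm_b = pdist(reps_b, metric="cosine")
    # The RSA score is the correlation between the two dissimilarity
    # structures; higher means more similar representational geometry.
    return pearsonr(rdm_a, rdm_b)[0]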
Anthology ID:
2021.insights-1.9
Volume:
Proceedings of the Second Workshop on Insights from Negative Results in NLP
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
João Sedoc, Anna Rogers, Anna Rumshisky, Shabnam Tafreshi
Venue:
insights
Publisher:
Association for Computational Linguistics
Pages:
54–61
URL:
https://aclanthology.org/2021.insights-1.9
DOI:
10.18653/v1/2021.insights-1.9
Cite (ACL):
Qi Zhu, Yuxian Gu, Lingxiao Luo, Bing Li, Cheng Li, Wei Peng, Minlie Huang, and Xiaoyan Zhu. 2021. When does Further Pre-training MLM Help? An Empirical Study on Task-Oriented Dialog Pre-training. In Proceedings of the Second Workshop on Insights from Negative Results in NLP, pages 54–61, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
When does Further Pre-training MLM Help? An Empirical Study on Task-Oriented Dialog Pre-training (Zhu et al., insights 2021)
PDF:
https://aclanthology.org/2021.insights-1.9.pdf
Software:
2021.insights-1.9.Software.zip
Video:
https://aclanthology.org/2021.insights-1.9.mp4
Code:
zqwerty/toddapt