%0 Conference Proceedings
%T Improving Text Auto-Completion with Next Phrase Prediction
%A Lee, Dong-Ho
%A Hu, Zhiqiang
%A Lee, Roy Ka-Wei
%Y Moens, Marie-Francine
%Y Huang, Xuanjing
%Y Specia, Lucia
%Y Yih, Scott Wen-tau
%S Findings of the Association for Computational Linguistics: EMNLP 2021
%D 2021
%8 November
%I Association for Computational Linguistics
%C Punta Cana, Dominican Republic
%F lee-etal-2021-improving-text-auto
%X Language models such as GPT-2 have performed well on constructing syntactically sound sentences for text auto-completion tasks. However, such models often require considerable training effort to adapt to specific writing domains (e.g., medical). In this paper, we propose an intermediate training strategy to enhance pre-trained language models’ performance in the text auto-completion task and quickly adapt them to specific domains. Our strategy includes a novel self-supervised training objective called Next Phrase Prediction (NPP), which encourages a language model to complete the partial query with enriched phrases and eventually improve the model’s text auto-completion performance. Preliminary experiments have shown that our approach is able to outperform the baselines in auto-completion for email and academic-writing domains.
%R 10.18653/v1/2021.findings-emnlp.378
%U https://aclanthology.org/2021.findings-emnlp.378
%U https://doi.org/10.18653/v1/2021.findings-emnlp.378
%P 4434-4438