Blank Language Models

Tianxiao Shen, Victor Quach, Regina Barzilay, Tommi Jaakkola


Abstract
We propose Blank Language Model (BLM), a model that generates sequences by dynamically creating and filling in blanks. The blanks control which part of the sequence to expand, making BLM ideal for a variety of text editing and rewriting tasks. The model can start from a single blank or partially completed text with blanks at specified locations. It iteratively determines which word to place in a blank and whether to insert new blanks, and stops generating when no blanks are left to fill. BLM can be efficiently trained using a lower bound of the marginal data likelihood. On the task of filling missing text snippets, BLM significantly outperforms all other baselines in terms of both accuracy and fluency. Experiments on style transfer and damaged ancient text restoration demonstrate the potential of this framework for a wide range of applications.
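To make the generation procedure concrete, below is a minimal Python sketch of the blank-filling loop described in the abstract. It assumes a trained network is available behind a choose_action function; that function, the toy policy, and all names here are hypothetical stand-ins for illustration, not the authors' implementation (see the Varal7/blank_language_model repository linked below for the actual code).

import random

BLANK = "__"

def generate(choose_action, canvas=None):
    """Fill blanks one at a time until none remain.

    choose_action stands in for the trained model: given the current
    canvas (a list of words and blanks), it returns the index of a blank,
    the word to place there, and two booleans saying whether to create a
    new blank to the left and/or right of that word.
    """
    canvas = list(canvas) if canvas else [BLANK]  # default: start from a single blank
    while BLANK in canvas:
        i, word, left, right = choose_action(canvas)
        filler = ([BLANK] if left else []) + [word] + ([BLANK] if right else [])
        canvas = canvas[:i] + filler + canvas[i + 1:]  # replace the blank
    return " ".join(canvas)

def toy_policy(canvas):
    """Hypothetical stand-in policy: pick a random blank and a random word,
    and keep a blank on either side with probability 0.3 (so the loop
    terminates almost surely)."""
    i = random.choice([k for k, t in enumerate(canvas) if t == BLANK])
    word = random.choice(["the", "cat", "sat", "on", "a", "mat"])
    return i, word, random.random() < 0.3, random.random() < 0.3

print(generate(toy_policy))                                     # free generation
print(generate(toy_policy, ["they", BLANK, "to", BLANK, "."]))  # text infilling

Starting from a partially completed canvas, as in the second call, restricts generation to the specified blank locations; this is what makes the framework suited to infilling, style transfer, and restoration tasks.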
Anthology ID: 2020.emnlp-main.420
Volume: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month: November
Year: 2020
Address: Online
Editors: Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 5186–5198
URL: https://aclanthology.org/2020.emnlp-main.420
DOI: 10.18653/v1/2020.emnlp-main.420
Cite (ACL): Tianxiao Shen, Victor Quach, Regina Barzilay, and Tommi Jaakkola. 2020. Blank Language Models. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 5186–5198, Online. Association for Computational Linguistics.
Cite (Informal): Blank Language Models (Shen et al., EMNLP 2020)
PDF: https://aclanthology.org/2020.emnlp-main.420.pdf
Video: https://slideslive.com/38939329
Code: Varal7/blank_language_model
Data: WikiText-103, WikiText-2