Simple and Effective Multi-Token Completion from Masked Language Models

Oren Kalinsky, Guy Kushilevitz, Alexander Libov, Yoav Goldberg


Abstract
Pre-trained neural masked language models are often used for predicting a replacement token at a given sequence position, in a cloze-like task. However, this usage is restricted to predicting a single token from a relatively small pre-trained vocabulary. Recent sequence-to-sequence pre-trained LMs like T5 do allow predicting multi-token completions, but are more expensive to train and run. We show that pre-trained masked language models can be adapted to produce multi-token completions with only a modest addition to their parameter count. We propose two simple adaptation approaches that trade parameter count for accuracy. The first generates multi-token completions from a conditioned RNN; it has a very low parameter count and achieves competitive results. The second is even simpler: it adds items corresponding to multi-token units to the output prediction matrix. Though it has a higher parameter count than the RNN method, it surpasses current state-of-the-art multi-token completion models, including T5-3B, while being significantly more parameter-efficient. We demonstrate that our approach is flexible across vocabularies and domains and can effectively leverage existing pre-trained models available in different domains. Finally, a human evaluation further validates our results, showing that our solution regularly produces valid completions and achieves reasonable correctness on factual-sentence completions.
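As a rough illustration of the second approach (extending an MLM's output prediction matrix with entries for multi-token units), the following PyTorch sketch scores a list of candidate phrases at a single [MASK] position. The model choice (bert-base-uncased), the phrase list, and the randomly initialized phrase embeddings are illustrative assumptions, not the paper's exact setup; in the actual method the added rows would be trained.

```python
# Minimal sketch: score multi-token units at one [MASK] position by
# treating them as extra rows of the MLM's output prediction matrix.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

# Hypothetical multi-token candidates appended to the output vocabulary.
phrases = ["new york city", "machine learning", "prime minister"]

# One output embedding per multi-token unit; randomly initialized here
# for illustration, but learned during adaptation in the real method.
phrase_out = nn.Embedding(len(phrases), encoder.config.hidden_size)

text = "She moved to [MASK] last year."
inputs = tokenizer(text, return_tensors="pt")
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]

with torch.no_grad():
    hidden = encoder(**inputs).last_hidden_state[0, mask_pos]  # (hidden_size,)
    scores = phrase_out.weight @ hidden  # one logit per multi-token unit
print(phrases[scores.argmax().item()])
```

The appeal of this formulation is that inference stays a single matrix-vector product over an enlarged output matrix, with no autoregressive decoding, which is what makes it simpler than the conditioned-RNN variant.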
Anthology ID: 2023.findings-eacl.179
Volume: Findings of the Association for Computational Linguistics: EACL 2023
Month: May
Year: 2023
Address: Dubrovnik, Croatia
Editors: Andreas Vlachos, Isabelle Augenstein
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 2356–2369
URL: https://aclanthology.org/2023.findings-eacl.179
DOI: 10.18653/v1/2023.findings-eacl.179
Cite (ACL): Oren Kalinsky, Guy Kushilevitz, Alexander Libov, and Yoav Goldberg. 2023. Simple and Effective Multi-Token Completion from Masked Language Models. In Findings of the Association for Computational Linguistics: EACL 2023, pages 2356–2369, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal): Simple and Effective Multi-Token Completion from Masked Language Models (Kalinsky et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-eacl.179.pdf
Video: https://aclanthology.org/2023.findings-eacl.179.mp4