Generative Text Modeling through Short Run Inference

Bo Pang, Erik Nijkamp, Tian Han, Ying Nian Wu


Abstract
Latent variable models for text, when trained successfully, accurately model the data distribution and capture global semantic and syntactic features of sentences. The prominent approach to training such models is the variational autoencoder (VAE). VAEs are nevertheless challenging to train and often converge to a trivial local optimum in which the latent variable is ignored and its posterior collapses into the prior, an issue known as posterior collapse. Various techniques have been proposed to mitigate this issue, most of which focus on improving the inference model to yield latent codes of higher quality. The present work instead proposes a short-run dynamics for inference: it is initialized from the prior distribution of the latent variable and then runs a small number (e.g., 20) of Langevin dynamics steps guided by the posterior distribution. The major advantage of our method is that it requires neither a separate inference model nor an assumption of simple geometry for the posterior distribution, thus rendering an automatic, natural, and flexible inference engine. We show that models trained with short-run dynamics model the data more accurately than strong language-model and VAE baselines, and exhibit no sign of posterior collapse. Analyses of the latent space show that interpolation in the latent space generates coherent sentences with smooth transitions, and that latent features from unsupervised pretraining improve classification over strong baselines. Together, these results expose a well-structured latent space of our generative model.
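The inference engine described in the abstract amounts to a few Langevin steps on the log-joint density. Below is a minimal PyTorch sketch of such a short-run procedure, not the authors' released implementation (see the code link below); the decoder log-likelihood `log_px_given_z`, the latent dimension `z_dim`, and the step size are illustrative assumptions, and the prior is taken to be a standard Gaussian.

```python
import torch

def short_run_inference(x, log_px_given_z, z_dim, n_steps=20, step_size=0.3):
    # Initialize the latent code from the prior N(0, I).
    z = torch.randn(x.size(0), z_dim, requires_grad=True)
    for _ in range(n_steps):
        # Log of the unnormalized posterior: log p(x | z) + log p(z),
        # where log p(z) = -0.5 * ||z||^2 up to an additive constant.
        log_joint = log_px_given_z(x, z).sum() - 0.5 * (z ** 2).sum()
        grad = torch.autograd.grad(log_joint, z)[0]
        # Langevin step: gradient ascent on the log-joint plus Gaussian noise.
        z = z + 0.5 * step_size ** 2 * grad + step_size * torch.randn_like(z)
        z = z.detach().requires_grad_(True)
    return z.detach()
```

In a learning loop, the sampled z would then be plugged into the decoder's log-likelihood to update the generator parameters, so no separate encoder network is ever trained.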
Anthology ID:
2021.eacl-main.98
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
Month:
April
Year:
2021
Address:
Online
Editors:
Paola Merlo, Jörg Tiedemann, Reut Tsarfaty
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
1156–1165
URL:
https://aclanthology.org/2021.eacl-main.98
DOI:
10.18653/v1/2021.eacl-main.98
Cite (ACL):
Bo Pang, Erik Nijkamp, Tian Han, and Ying Nian Wu. 2021. Generative Text Modeling through Short Run Inference. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 1156–1165, Online. Association for Computational Linguistics.
Cite (Informal):
Generative Text Modeling through Short Run Inference (Pang et al., EACL 2021)
PDF:
https://aclanthology.org/2021.eacl-main.98.pdf
Code:
bpucla/sri_text
Data:
SNLI