Continuous Language Generative Flow

Zineng Tang, Shiyue Zhang, Hyounghun Kim, Mohit Bansal


Abstract
Recent years have witnessed various types of generative models for natural language generation (NLG), especially RNN- or Transformer-based sequence-to-sequence models, as well as models based on variational autoencoders (VAEs) and generative adversarial networks (GANs). However, flow-based generative models, which achieve strong performance in image generation thanks to their invertibility and exact density estimation, have been less explored for NLG. In this paper, we propose a flow-based language generation model that adapts previous flow generative models to language generation via continuous input embeddings, adapted affine coupling structures, and a novel architecture for autoregressive text generation. We also apply our framework to sequence-to-sequence generation, including text- and video-based question generation (QG) and neural machine translation (NMT), as well as data augmentation for question answering (QA). We use our language flow model to provide extra input features for QG and NMT, which yields improvements over strong QG baselines on SQuAD and TVQA and over a strong NMT baseline on WMT16. We also augment QA data with new contexts by injecting noise into the latent features of the language flow and show that this augmentation leads to large performance improvements over strong baselines on SQuAD and TVQA.
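The affine coupling structure the abstract mentions is the standard invertible building block of flow models: half of the features pass through unchanged, while the other half receives a scale-and-shift transform whose parameters are predicted from the first half, so both the inverse and the Jacobian log-determinant are exact and cheap. Below is a minimal PyTorch sketch of this idea, not the authors' released code; the class name AffineCoupling, the small two-layer network, and all dimensions are illustrative assumptions. The short follow-up shows how the latent-noise data augmentation described in the abstract could look under the same assumptions.

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """Illustrative affine coupling layer (not the paper's exact architecture)."""

    def __init__(self, dim, hidden=256):
        super().__init__()
        # dim must be even: the layer splits features into two halves
        self.net = nn.Sequential(
            nn.Linear(dim // 2, hidden),
            nn.ReLU(),
            nn.Linear(hidden, dim),  # predicts log-scale and shift for the 2nd half
        )

    def forward(self, x):
        x1, x2 = x.chunk(2, dim=-1)
        log_s, t = self.net(x1).chunk(2, dim=-1)
        log_s = torch.tanh(log_s)            # bound the scales for numerical stability
        y2 = x2 * torch.exp(log_s) + t       # affine transform of the second half
        log_det = log_s.sum(dim=-1)          # exact log |det Jacobian|
        return torch.cat([x1, y2], dim=-1), log_det

    def inverse(self, y):
        y1, y2 = y.chunk(2, dim=-1)
        log_s, t = self.net(y1).chunk(2, dim=-1)
        log_s = torch.tanh(log_s)
        x2 = (y2 - t) * torch.exp(-log_s)    # exact inverse, no iterative solve needed
        return torch.cat([y1, x2], dim=-1)

# Hypothetical illustration of the latent-noise QA data augmentation:
# map continuous word embeddings to the latent space, perturb, invert.
layer = AffineCoupling(dim=300)              # e.g. 300-d continuous word embeddings
x = torch.randn(8, 300)                      # a batch of 8 embedding vectors
z, log_det = layer(x)
z_noisy = z + 0.1 * torch.randn_like(z)      # inject small Gaussian noise
x_aug = layer.inverse(z_noisy)               # decode back to embedding space
```

Because the coupling map is bijective, the perturbed latents decode to valid points in the embedding space, which is what makes this style of augmentation possible; a full model would stack many such layers and condition generation autoregressively, as the paper proposes.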
Anthology ID:
2021.acl-long.355
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
4609–4622
URL:
https://aclanthology.org/2021.acl-long.355
DOI:
10.18653/v1/2021.acl-long.355
Cite (ACL):
Zineng Tang, Shiyue Zhang, Hyounghun Kim, and Mohit Bansal. 2021. Continuous Language Generative Flow. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 4609–4622, Online. Association for Computational Linguistics.
Cite (Informal):
Continuous Language Generative Flow (Tang et al., ACL-IJCNLP 2021)
PDF:
https://aclanthology.org/2021.acl-long.355.pdf
Video:
https://aclanthology.org/2021.acl-long.355.mp4
Code:
zinengtang/continuousflownlg
Data:
TVQA | TVQA+