Hierarchical Quantized Representations for Script Generation

Noah Weber, Leena Shekhar, Niranjan Balasubramanian, Nathanael Chambers


Abstract
Scripts define knowledge about how everyday scenarios (such as going to a restaurant) are expected to unfold. One of the challenges to learning scripts is the hierarchical nature of the knowledge. For example, an arrested suspect might plead innocent or guilty, and a very different track of events is then expected to follow. To capture this type of information, we propose an autoencoder model with a latent space defined by a hierarchy of categorical variables. We utilize a recently proposed vector quantization-based approach, which allows continuous embeddings to be associated with each latent variable value. This permits the decoder to softly decide what portions of the latent hierarchy to condition on by attending over the value embeddings for a given setting. Our model effectively encodes and generates scripts, outperforming a recent language modeling-based method on several standard tasks and achieving substantially lower perplexity scores than that baseline.
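
The abstract describes a vector-quantized latent hierarchy in which each categorical latent variable has a continuous embedding per value, and the decoder softly decides how much of the hierarchy to condition on by attending over those value embeddings. The following is a minimal NumPy sketch of those two operations; the function names, codebook shapes, and simple dot-product attention are illustrative assumptions rather than the paper's exact formulation (the authors' implementation is in the linked StonyBrookNLP/HAQAE repository).

import numpy as np

def quantize(z_e, codebook):
    # Nearest-neighbor vector quantization: pick the codebook entry
    # (one embedding per categorical value) closest to the encoder output.
    # z_e: (d,) continuous encoder output; codebook: (K, d) value embeddings.
    dists = np.linalg.norm(codebook - z_e, axis=1)
    k = int(np.argmin(dists))          # discrete latent value
    return k, codebook[k]              # value index and its embedding

def attend_over_values(query, codebook):
    # Soft attention over one latent variable's value embeddings, letting the
    # decoder weight how much it relies on this level of the hierarchy.
    scores = codebook @ query                    # (K,) dot-product scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                     # softmax over values
    return weights @ codebook                    # (d,) attended context vector

# Illustrative usage with a hypothetical 8-value, 16-dimensional latent variable.
rng = np.random.default_rng(0)
codebook = rng.normal(size=(8, 16))
z_e = rng.normal(size=16)
k, e_k = quantize(z_e, codebook)
context = attend_over_values(e_k, codebook)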
Anthology ID:
D18-1413
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
3783–3792
URL:
https://aclanthology.org/D18-1413
DOI:
10.18653/v1/D18-1413
Cite (ACL):
Noah Weber, Leena Shekhar, Niranjan Balasubramanian, and Nathanael Chambers. 2018. Hierarchical Quantized Representations for Script Generation. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 3783–3792, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Hierarchical Quantized Representations for Script Generation (Weber et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1413.pdf
Video:
https://aclanthology.org/D18-1413.mp4
Code:
StonyBrookNLP/HAQAE