Autoencoding Pixies: Amortised Variational Inference with Graph Convolutions for Functional Distributional Semantics

Guy Emerson


Abstract
Functional Distributional Semantics provides a linguistically interpretable framework for distributional semantics, by representing the meaning of a word as a function (a binary classifier), instead of a vector. However, the large number of latent variables means that inference is computationally expensive, and training a model is therefore slow to converge. In this paper, I introduce the Pixie Autoencoder, which augments the generative model of Functional Distributional Semantics with a graph-convolutional neural network to perform amortised variational inference. This allows the model to be trained more effectively, achieving better results on two tasks (semantic similarity in context and semantic composition), and outperforming BERT, a large pre-trained language model.
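The abstract describes using a graph-convolutional network as an amortised inference network: the encoder reads a semantic dependency graph and outputs the parameters of a variational posterior over the latent variables (the "pixies"), one per graph node. The paper's actual architecture differs in detail; the following is only a minimal illustrative sketch of that general pattern, with made-up dimensions, a toy three-node graph, and a Kipf-and-Welling-style normalised graph convolution feeding Gaussian mean/log-variance heads:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy semantic graph: 3 nodes (a verb and its two arguments).
# Adjacency matrix with self-loops, symmetrically normalised.
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)
A_hat = A + np.eye(3)
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt

# Node features: one embedding per predicate (dimensions arbitrary here).
X = rng.normal(size=(3, 8))

# One graph-convolutional layer, then two linear heads producing the
# mean and log-variance of a Gaussian variational posterior per node.
W = rng.normal(size=(8, 8)) * 0.1
W_mu = rng.normal(size=(8, 4)) * 0.1
W_logvar = rng.normal(size=(8, 4)) * 0.1

H = np.maximum(A_norm @ X @ W, 0.0)   # ReLU graph convolution
mu = H @ W_mu                          # variational means
logvar = H @ W_logvar                  # variational log-variances

# Reparameterisation trick: sample latent vectors z = mu + sigma * eps,
# so gradients can flow through the sampling step during training.
eps = rng.normal(size=mu.shape)
z = mu + np.exp(0.5 * logvar) * eps
print(z.shape)  # one latent vector per graph node
```

In a real training loop the weights would be learned by maximising the evidence lower bound, so that a single encoder pass replaces per-example variational optimisation; that is what makes the inference "amortised".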
Anthology ID: 2020.acl-main.367
Volume: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month: July
Year: 2020
Address: Online
Editors: Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 3982–3995
URL: https://aclanthology.org/2020.acl-main.367
DOI: 10.18653/v1/2020.acl-main.367
Cite (ACL): Guy Emerson. 2020. Autoencoding Pixies: Amortised Variational Inference with Graph Convolutions for Functional Distributional Semantics. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 3982–3995, Online. Association for Computational Linguistics.
Cite (Informal): Autoencoding Pixies: Amortised Variational Inference with Graph Convolutions for Functional Distributional Semantics (Emerson, ACL 2020)
PDF: https://aclanthology.org/2020.acl-main.367.pdf
Software: 2020.acl-main.367.Software.zip
Video: http://slideslive.com/38929060
Code: guyemerson/pixie