Polarized-VAE: Proximity Based Disentangled Representation Learning for Text Generation

Vikash Balasubramanian, Ivan Kobyzev, Hareesh Bahuleyan, Ilya Shapiro, Olga Vechtomova


Abstract
Learning disentangled representations of real-world data is a challenging open problem. Most previous methods have focused on either supervised approaches, which use attribute labels, or unsupervised approaches that manipulate the factorization in the latent space of models such as the variational autoencoder (VAE) by training with task-specific losses. In this work, we propose polarized-VAE, an approach that disentangles select attributes in the latent space based on proximity measures reflecting the similarity between data points with respect to these attributes. We apply our method to disentangle the semantics and syntax of sentences and carry out transfer experiments. Polarized-VAE outperforms the VAE baseline and is competitive with state-of-the-art approaches, while being a more general framework that is applicable to other attribute disentanglement tasks.
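The proximity-based idea in the abstract can be illustrated with a small sketch. The function below is a hypothetical contrastive-style penalty, not the authors' exact training objective: for pairs of sentences encoded into a latent subspace, it pulls attribute-similar pairs together and pushes dissimilar pairs at least a margin apart. All names and the specific margin formulation are illustrative assumptions.

```python
import numpy as np

def polarization_loss(z_a, z_b, similar, margin=1.0):
    """Hypothetical proximity-based penalty (illustrative, not the paper's loss).

    z_a, z_b : arrays of shape (batch, dim), latent codes of sentence pairs
    similar  : array of shape (batch,), 1.0 if the pair shares the attribute
               (e.g. similar syntax), 0.0 otherwise
    """
    d = np.linalg.norm(z_a - z_b, axis=-1)          # pairwise Euclidean distance
    pull = similar * d ** 2                          # similar pairs: shrink distance
    push = (1.0 - similar) * np.maximum(0.0, margin - d) ** 2  # dissimilar: enforce margin
    return float(np.mean(pull + push))
```

In a polarized-VAE-style setup, a term of this kind would be added to the usual VAE ELBO so that the designated latent subspace becomes organized by the chosen attribute, which is what enables the semantics/syntax transfer experiments described above.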
Anthology ID:
2021.eacl-main.32
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
Month:
April
Year:
2021
Address:
Online
Editors:
Paola Merlo, Jörg Tiedemann, Reut Tsarfaty
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
416–423
URL:
https://aclanthology.org/2021.eacl-main.32
DOI:
10.18653/v1/2021.eacl-main.32
Cite (ACL):
Vikash Balasubramanian, Ivan Kobyzev, Hareesh Bahuleyan, Ilya Shapiro, and Olga Vechtomova. 2021. Polarized-VAE: Proximity Based Disentangled Representation Learning for Text Generation. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 416–423, Online. Association for Computational Linguistics.
Cite (Informal):
Polarized-VAE: Proximity Based Disentangled Representation Learning for Text Generation (Balasubramanian et al., EACL 2021)
PDF:
https://aclanthology.org/2021.eacl-main.32.pdf
Data
SNLI