A Deep Decomposable Model for Disentangling Syntax and Semantics in Sentence Representation

Dingcheng Li, Hongliang Fei, Shaogang Ren, Ping Li


Abstract
Recently, disentanglement based on generative adversarial networks (GANs) or variational autoencoders (VAEs) has significantly advanced performance across diverse applications in the CV and NLP domains. Nevertheless, those models still operate at a coarse level when disentangling closely related properties, such as syntax and semantics in human languages. This paper introduces a deep decomposable model based on VAEs to disentangle syntax and semantics by applying total correlation penalties on KL divergences. Notably, we decompose the KL divergence term of the original VAE so that the generated latent variables can be separated in a more clear-cut and interpretable way. Experiments on benchmark datasets show that our proposed model can significantly improve the quality of disentanglement between syntactic and semantic representations on both semantic similarity and syntactic similarity tasks.
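The total-correlation penalty mentioned in the abstract presumably builds on the standard decomposition of the aggregate KL term in the VAE objective (as popularized by β-TCVAE); the paper's exact formulation may differ, so the following is a hedged sketch rather than the authors' equation. Writing q(z) = E_{p(x)}[q(z|x)] for the aggregate posterior over latent variable z with dimensions (or blocks) z_j:

\[
\mathbb{E}_{p(x)}\!\left[\mathrm{KL}\big(q(z \mid x)\,\|\,p(z)\big)\right]
= \underbrace{I_q(x; z)}_{\text{index-code MI}}
\;+\; \underbrace{\mathrm{KL}\Big(q(z)\,\Big\|\,\textstyle\prod_j q(z_j)\Big)}_{\text{total correlation}}
\;+\; \underbrace{\textstyle\sum_j \mathrm{KL}\big(q(z_j)\,\|\,p(z_j)\big)}_{\text{dimension-wise KL}}
\]

Penalizing the total-correlation term pushes the latent factors toward statistical independence; in this paper's setting, the blocks z_j would plausibly correspond to the syntactic and semantic latent representations.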
Anthology ID:
2021.findings-emnlp.364
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
4300–4310
URL:
https://aclanthology.org/2021.findings-emnlp.364
DOI:
10.18653/v1/2021.findings-emnlp.364
Cite (ACL):
Dingcheng Li, Hongliang Fei, Shaogang Ren, and Ping Li. 2021. A Deep Decomposable Model for Disentangling Syntax and Semantics in Sentence Representation. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 4300–4310, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
A Deep Decomposable Model for Disentangling Syntax and Semantics in Sentence Representation (Li et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.364.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.364.mp4