Disentangled Sequence to Sequence Learning for Compositional Generalization

Hao Zheng, Mirella Lapata


Abstract
There is mounting evidence that existing neural network models, in particular the very popular sequence-to-sequence architecture, struggle to systematically generalize to unseen compositions of seen components. We demonstrate that one of the reasons hindering compositional generalization is that representations are entangled. We propose an extension to sequence-to-sequence models which encourages disentanglement by adaptively re-encoding (at each time step) the source input. Specifically, we condition the source representations on the newly decoded target context, which makes it easier for the encoder to exploit specialized information for each prediction rather than capturing it all in a single forward pass. Experimental results on semantic parsing and machine translation show that our proposal delivers more disentangled representations and better generalization.
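To make the adaptive re-encoding idea concrete, here is a minimal PyTorch sketch: at every decoding step, the source tokens are re-encoded jointly with the target prefix generated so far, so the source representations are conditioned on the current target context rather than computed once. This is not the authors' released implementation (see mswellhao/dangle for that); all module names, dimensions, and the toy greedy-decoding loop are illustrative assumptions.

import torch
import torch.nn as nn

class AdaptiveReencodingSeq2Seq(nn.Module):
    """Illustrative sketch: re-encode source + decoded prefix at each step."""

    def __init__(self, vocab_size=100, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers)
        self.out = nn.Linear(d_model, vocab_size)

    def reencode(self, src_ids, tgt_prefix_ids):
        # Concatenate source tokens with the decoded target prefix and run the
        # encoder over the joint sequence: source representations are thereby
        # specialized to the current target context.
        joint = torch.cat([src_ids, tgt_prefix_ids], dim=1)
        return self.encoder(self.embed(joint))

    def greedy_decode(self, src_ids, bos_id=1, eos_id=2, max_len=20):
        prefix = torch.full((src_ids.size(0), 1), bos_id, dtype=torch.long)
        for _ in range(max_len):
            # Fresh forward pass at every time step (the key difference from a
            # standard seq2seq model, which encodes the source only once).
            states = self.reencode(src_ids, prefix)
            # Predict the next token from the re-encoded last prefix position.
            logits = self.out(states[:, -1])
            next_tok = logits.argmax(dim=-1, keepdim=True)
            prefix = torch.cat([prefix, next_tok], dim=1)
            if (next_tok == eos_id).all():
                break
        return prefix

# Toy usage with random (untrained) weights and a batch of one 5-token source.
model = AdaptiveReencodingSeq2Seq()
src = torch.randint(3, 100, (1, 5))
print(model.greedy_decode(src))

The obvious cost of this design is that decoding requires a full encoder pass per output token; the trade-off the paper argues for is that each pass can specialize the source representations for the upcoming prediction.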
Anthology ID:
2022.acl-long.293
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
4256–4268
URL:
https://aclanthology.org/2022.acl-long.293
DOI:
10.18653/v1/2022.acl-long.293
Cite (ACL):
Hao Zheng and Mirella Lapata. 2022. Disentangled Sequence to Sequence Learning for Compositional Generalization. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 4256–4268, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Disentangled Sequence to Sequence Learning for Compositional Generalization (Zheng & Lapata, ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.293.pdf
Video:
https://aclanthology.org/2022.acl-long.293.mp4
Code:
mswellhao/dangle
Data:
CFQ