Improving Constituent Representation with Hypertree Neural Networks

Hao Zhou, Gongshen Liu, Kewei Tu


Abstract
Many natural language processing tasks involve text spans, and thus high-quality span representations are needed to enhance neural approaches to these tasks. Most existing methods of span representation are based on simple derivations (such as max-pooling) from word representations and do not utilize compositional structures of natural language. In this paper, we aim to improve representations of constituent spans using a novel hypertree neural network (HTNN) that is structured with constituency parse trees. Each node in the HTNN represents a constituent of the input sentence and each hyperedge represents a composition of smaller child constituents into a larger parent constituent. In each update iteration of the HTNN, the representation of each constituent is computed based on all the hyperedges connected to it, thus incorporating both bottom-up and top-down compositional information. We conduct comprehensive experiments to evaluate HTNNs against other span representation models, and the results show the effectiveness of HTNNs.
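The update scheme the abstract describes can be illustrated with a toy numerical sketch (this is not the authors' implementation; the tree, dimensions, and weight matrices below are hypothetical). Each hyperedge connects a parent constituent to its two children, and one iteration refreshes every constituent vector from all hyperedges touching it, mixing bottom-up (children to parent) and top-down (parent to children) information:

```python
# Illustrative sketch only: a toy hypertree update over a binary
# constituency tree, with random (untrained) weights.
import numpy as np

rng = np.random.default_rng(0)
d = 8  # representation size (arbitrary choice for this sketch)

# Toy parse of a 3-word sentence; spans are (start, end) word indices.
# Each hyperedge is (parent, left_child, right_child).
hyperedges = [((0, 3), (0, 2), (2, 3)),   # S  -> NP VP
              ((0, 2), (0, 1), (1, 2))]   # NP -> Det N
nodes = {span for edge in hyperedges for span in edge}
h = {span: rng.standard_normal(d) for span in nodes}  # initial vectors

# Shared mixing weights; a real model would learn these.
W_up = rng.standard_normal((d, 2 * d))    # children -> parent
W_down = rng.standard_normal((d, d))      # parent -> child

def iterate(h):
    """One hypertree update: each node aggregates messages from
    every hyperedge it participates in."""
    msgs = {span: [] for span in h}
    for parent, left, right in hyperedges:
        # bottom-up: the two children jointly inform the parent
        msgs[parent].append(np.tanh(W_up @ np.concatenate([h[left], h[right]])))
        # top-down: the parent informs each child
        msgs[left].append(np.tanh(W_down @ h[parent]))
        msgs[right].append(np.tanh(W_down @ h[parent]))
    # average incoming messages; nodes with none keep their vector
    return {s: np.mean(msgs[s], axis=0) if msgs[s] else h[s] for s in h}

for _ in range(2):  # run a couple of update iterations
    h = iterate(h)
print(h[(0, 2)].shape)  # representation of the NP span, words 0..2
```

In this sketch, interior constituents such as the NP span receive both a bottom-up message (from their children) and a top-down message (from their parent) in the same iteration, which is the property that distinguishes the hypertree update from a purely bottom-up tree encoder.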
Anthology ID:
2022.naacl-main.121
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
1682–1692
URL:
https://aclanthology.org/2022.naacl-main.121
DOI:
10.18653/v1/2022.naacl-main.121
Cite (ACL):
Hao Zhou, Gongshen Liu, and Kewei Tu. 2022. Improving Constituent Representation with Hypertree Neural Networks. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 1682–1692, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Improving Constituent Representation with Hypertree Neural Networks (Zhou et al., NAACL 2022)
PDF:
https://aclanthology.org/2022.naacl-main.121.pdf