Towards Unsupervised Compositional Entailment with Multi-Graph Embedding Models

Lorenzo Bertolini, Julie Weeds, David Weir


Abstract
Compositionality and inference are essential features of human language, and should hence be simultaneously accessible to a model of meaning. Although theory-grounded, distributional models can be directly tested only on compositionality, usually through similarity judgements, while testing for inference requires external resources. Recent work has shown that knowledge graph embedding (KGE) architectures can be used to train distributional models capable of learning syntax-aware compositional representations, by training on syntactic graphs. We propose to expand such work with Multi-Graph embedding (MuG) models, a new set of models that learn from both syntactic graphs and knowledge graphs. Using a phrase-level inference task, we show how MuGs can simultaneously handle syntax-aware composition and inference, while remaining competitive distributional models with respect to lexical and compositional similarity.
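To make the KGE idea referenced in the abstract concrete, the following is a minimal, illustrative sketch of a translational (TransE-style) scoring function applied to syntactic-graph triples (head word, dependency relation, dependent word). The vocabulary, relation names, and dimensionality are invented for illustration; the paper's actual MuG architectures and training setup are not specified here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary of words (graph nodes) and dependency relations (edge
# labels), standing in for a syntactic graph; all names are hypothetical.
entities = ["dog", "barks", "loud"]
relations = ["nsubj", "advmod"]
dim = 8

# Randomly initialised embeddings, as at the start of KGE training.
E = {w: rng.normal(size=dim) for w in entities}
R = {r: rng.normal(size=dim) for r in relations}

def score(h: str, r: str, t: str) -> float:
    """TransE-style plausibility of the triple (h, r, t):
    higher (closer to zero) means head + relation lands nearer the tail."""
    return -float(np.linalg.norm(E[h] + R[r] - E[t]))

# An observed syntactic triple versus a corrupted (negative) one;
# training would push the observed score above the corrupted score
# by a margin, shaping syntax-aware word representations.
s_pos = score("dog", "nsubj", "barks")
s_neg = score("loud", "nsubj", "barks")
```

The key point is only that a standard KGE objective, normally used for knowledge-base triples, can equally consume dependency triples; MuG models, per the abstract, combine both graph types in one embedding space.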
Anthology ID:
2023.iwcs-1.5
Volume:
Proceedings of the 15th International Conference on Computational Semantics
Month:
June
Year:
2023
Address:
Nancy, France
Editors:
Maxime Amblard, Ellen Breitholtz
Venue:
IWCS
SIG:
SIGSEM
Publisher:
Association for Computational Linguistics
Pages:
50–61
URL:
https://aclanthology.org/2023.iwcs-1.5
Cite (ACL):
Lorenzo Bertolini, Julie Weeds, and David Weir. 2023. Towards Unsupervised Compositional Entailment with Multi-Graph Embedding Models. In Proceedings of the 15th International Conference on Computational Semantics, pages 50–61, Nancy, France. Association for Computational Linguistics.
Cite (Informal):
Towards Unsupervised Compositional Entailment with Multi-Graph Embedding Models (Bertolini et al., IWCS 2023)
PDF:
https://aclanthology.org/2023.iwcs-1.5.pdf