Mehrnoosh Sadrzadeh

Also published as: M. Sadrzadeh


2024

VerbCLIP: Improving Verb Understanding in Vision-Language Models with Compositional Structures
Hadi Wazni | Kin Ian Lo | Mehrnoosh Sadrzadeh
Proceedings of the 3rd Workshop on Advances in Language and Vision Research (ALVR)

Verbs describe the dynamics of interactions between people, objects, and their environments. They play a crucial role in language formation and understanding. Nonetheless, recent vision-language models like CLIP rely predominantly on nouns and offer only a limited account of verbs. This limitation affects their performance in tasks requiring action recognition and scene understanding. In this work, we introduce VerbCLIP, a verb-centric vision-language model that learns the meanings of verbs based on a compositional approach to statistical machine learning. Our methods significantly outperform CLIP in zero-shot performance on the VALSE, VL-Checklist, and SVO-Probes datasets, with improvements of +2.38%, +3.14%, and +1.47% without fine-tuning. Fine-tuning resulted in further improvements, with gains of +2.85% and +9.2% on the VALSE and VL-Checklist datasets.

How Does an Adjective Sound Like? Exploring Audio Phrase Composition with Textual Embeddings
Saba Nazir | Mehrnoosh Sadrzadeh
Proceedings of the 2024 CLASP Conference on Multimodality and Interaction in Language Learning

We learn matrix representations for the frequent sound-relevant adjectives of English and compose them with vector representations of their nouns. The matrices are learnt jointly from audio and textual data, via linear regression and tensor skipgram. They are assessed using an adjective similarity benchmark and also a novel adjective-noun phrase similarity dataset, applied to two tasks: semantic similarity and audio similarity. Joint learning via Tensor Skipgram (TSG) outperforms audio-only models, and matrix composition outperforms addition and non-compositional phrase vectors.
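
To picture the linear-regression route concretely, here is a minimal sketch with toy random data standing in for the jointly learnt audio/text embeddings: an adjective's matrix is fit by least squares from (noun vector, phrase vector) pairs and then applied to an unseen noun.

```python
import numpy as np

# Minimal sketch: learn an adjective's matrix A by least squares so
# that A @ noun ≈ phrase on training pairs. Toy random data stands in
# for the paper's jointly learnt audio/text embeddings.
d = 4
rng = np.random.default_rng(0)
nouns = rng.normal(size=(10, d))      # rows: noun vectors n_i
phrases = rng.normal(size=(10, d))    # rows: target phrase vectors p_i

# Solve nouns @ X ≈ phrases in the least-squares sense; A = X.T then
# satisfies A @ n_i ≈ p_i for each training pair.
X, *_ = np.linalg.lstsq(nouns, phrases, rcond=None)
A = X.T                               # the adjective's matrix

# Composition: the phrase vector for a new noun is A applied to it.
new_noun = rng.normal(size=d)
phrase_vec = A @ new_noun
print(phrase_vec.shape)               # (4,)
```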

How can large language models become more human?
Daphne Wang | Mehrnoosh Sadrzadeh | Miloš Stanojević | Wing-Yee Chow | Richard Breheny
Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics

Psycholinguistic experiments reveal that the efficiency of human language use is founded on predictions at both the syntactic and lexical levels. Previous models of human prediction exploiting LLMs have used an information-theoretic measure called surprisal, with success on naturalistic text in a wide variety of languages, but under-performance on challenging text such as garden path sentences. This paper introduces a novel framework that combines the lexical predictions of an LLM with the syntactic structures provided by a dependency parser. The framework gives rise to an Incompatibility Fraction. When tested on two garden path datasets, it correlated well with human reading times, distinguished between easy and hard garden paths, and outperformed surprisal.
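
For reference, surprisal is the negative log-probability a language model assigns to a word given its context. A small illustration, with hypothetical probabilities standing in for an LLM's next-token distribution:

```python
import math

# Surprisal of a word given its context: s(w_t) = -log2 P(w_t | w_<t).
# These conditional probabilities are hypothetical stand-ins for an
# LLM's next-token distribution, chosen purely for illustration.
p_next = {
    ("the", "horse"): 0.20,
    ("horse", "raced"): 0.05,
    ("raced", "past"): 0.30,
}

def surprisal(context_word: str, word: str) -> float:
    return -math.log2(p_next[(context_word, word)])

# Low-probability continuations carry high surprisal; prior work
# correlates this quantity with human reading times.
print(surprisal("horse", "raced"))    # ≈ 4.32 bits
```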

2023

Towards Transparency in Coreference Resolution: A Quantum-Inspired Approach
Hadi Wazni | Mehrnoosh Sadrzadeh
Proceedings of The Sixth Workshop on Computational Models of Reference, Anaphora and Coreference (CRAC 2023)

2021

Proceedings of the 2021 Workshop on Semantic Spaces at the Intersection of NLP, Physics, and Cognitive Science (SemSpace)
Martha Lewis | Mehrnoosh Sadrzadeh

On the Quantum-like Contextuality of Ambiguous Phrases
Daphne Wang | Mehrnoosh Sadrzadeh | Samson Abramsky | Victor Cervantes
Proceedings of the 2021 Workshop on Semantic Spaces at the Intersection of NLP, Physics, and Cognitive Science (SemSpace)

Language is contextual, as the meanings of words depend on their contexts. Contextuality is, concomitantly, a well-defined concept in quantum mechanics, where it is considered a major resource for quantum computations. We investigate whether natural language exhibits any of quantum mechanics’ contextual features. We show that meaning combinations in ambiguous phrases can be modelled in the sheaf-theoretic framework for quantum contextuality, where they can become possibilistically contextual. Using the framework of Contextuality-by-Default (CbD), we explore the probabilistic variants of these and show that CbD-contextuality is also possible.
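
For intuition, here is a minimal sketch of a possibilistic check in the spirit of the sheaf-theoretic framework: a brute-force search for a global assignment of readings consistent with every context's support. The observables, outcomes, and supports below are hypothetical, and the test is for the strongest grade of contextuality (no global section at all).

```python
from itertools import product

# Observables stand for words of an ambiguous phrase; outcomes are
# their readings. Each measurement context lists which joint readings
# are possible (hypothetical supports, chosen for illustration).
observables = ["a", "b", "c", "d"]
outcomes = [0, 1]                      # two readings per word
contexts = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")]
support = {
    ("a", "b"): {(0, 0), (1, 1)},
    ("b", "c"): {(0, 0), (1, 1)},
    ("c", "d"): {(0, 0), (1, 1)},
    ("d", "a"): {(0, 1), (1, 0)},      # a PR-box-style twist
}

# Strong contextuality: no global assignment of readings restricts to
# a possible joint outcome in every context.
def has_global_section() -> bool:
    for vals in product(outcomes, repeat=len(observables)):
        g = dict(zip(observables, vals))
        if all((g[x], g[y]) in support[(x, y)] for x, y in contexts):
            return True
    return False

print("strongly contextual:", not has_global_section())   # True
```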

2020

Representation Learning for Type-Driven Composition
Gijs Wijnholds | Mehrnoosh Sadrzadeh | Stephen Clark
Proceedings of the 24th Conference on Computational Natural Language Learning

This paper is about learning word representations using grammatical type information. We use the syntactic types of Combinatory Categorial Grammar to develop multilinear representations, i.e. maps with n arguments, for words with different functional types. The multilinear maps of words compose with each other to form sentence representations. We extend the skipgram algorithm from vectors to multilinear maps to learn these representations and instantiate it on unary and binary maps for transitive verbs. These are evaluated on verb and sentence similarity and disambiguation tasks and a subset of the SICK relatedness dataset. Our model performs better than previous type-driven models and is competitive with state-of-the-art representation learning methods such as BERT and neural sentence encoders.
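
As an illustration of a binary map, here is a minimal sketch with toy dimensions and random data in place of the learned representations: a transitive verb is an order-3 tensor contracted with its subject and object vectors to yield a sentence vector.

```python
import numpy as np

# Toy sketch of type-driven composition for a transitive verb: the
# verb is a bilinear map (order-3 tensor) taking subject and object
# vectors to a sentence vector. Random data, illustrative dimensions.
d = 4
rng = np.random.default_rng(1)
V = rng.normal(size=(d, d, d))         # verb as a binary multilinear map
subj = rng.normal(size=d)              # subject noun vector
obj = rng.normal(size=d)               # object noun vector

# s_k = sum_{i,j} V[i, k, j] * subj[i] * obj[j]
sentence = np.einsum("ikj,i,j->k", V, subj, obj)
print(sentence.shape)                  # (4,): a vector in sentence space
```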

A toy distributional model for fuzzy generalised quantifiers
Mehrnoosh Sadrzadeh | Gijs Wijnholds
Proceedings of the Probability and Meaning Conference (PaM 2020)

Recent work in compositional distributional semantics showed how bialgebras model generalised quantifiers of natural language. That technique requires working with vector spaces over power sets of bases and is therefore computationally costly. It is possible to overcome the computational hurdles by working with fuzzy generalised quantifiers. In this paper, we show that the compositional, grammar-guided notion of natural language semantics extends from a binary to a many-valued setting, and we instantiate the fuzzy computations in it. We import vector representations of words and predicates, learnt from large-scale compositional distributional semantics, interpret them as fuzzy sets, and analyse their performance on a toy inference dataset.
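
To make the fuzzy reading concrete, here is a small sketch under illustrative assumptions: entities have membership degrees in fuzzy sets for a noun and a predicate, intersection is pointwise min, and a quantifier such as "most" becomes a ratio of fuzzy cardinalities.

```python
import numpy as np

# Fuzzy generalised quantifier sketch (membership degrees are made up
# for illustration). Each entity has a degree in [0, 1] of belonging
# to the noun's set A and the predicate's set B.
A = np.array([0.9, 0.7, 0.3, 0.8])     # e.g. fuzzy set for "dogs"
B = np.array([0.8, 0.6, 0.9, 0.1])     # e.g. fuzzy set for "bark"

def most(A: np.ndarray, B: np.ndarray) -> float:
    # "Most A are B" as the fuzzy cardinality |A ∩ B| / |A|, with
    # pointwise min as intersection and sum as cardinality.
    return np.minimum(A, B).sum() / A.sum()

print(most(A, B))                      # degree to which "most dogs bark" holds
```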

2019

Evaluating Composition Models for Verb Phrase Elliptical Sentence Embeddings
Gijs Wijnholds | Mehrnoosh Sadrzadeh
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)

Ellipsis is a natural language phenomenon where part of a sentence is missing and its information must be recovered from its surrounding context, as in “Cats chase dogs and so do foxes”. Formal semantics has different methods for resolving ellipsis and recovering the missing information, but the problem has not been considered for distributional semantics, where words have vector embeddings and combinations thereof provide embeddings for sentences. In elliptical sentences these combinations go beyond linear, as the elided information must be copied. In this paper, we develop different models for embedding VP-elliptical sentences. We extend existing verb disambiguation and sentence similarity datasets to ones containing elliptical phrases and evaluate our models on these datasets for a variety of non-linear combinations and their linear counterparts. We compare results of these compositional models to state-of-the-art holistic sentence encoders. Our results show that non-linear addition and a non-linear tensor-based composition outperform the naive non-compositional baselines and the linear models, and that sentence encoders perform well on sentence similarity, but not on verb disambiguation.
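
The copying idea can be sketched in a few lines, with toy vectors and a matrix verb standing in for the learned embeddings: the verb phrase embedding is computed once and reused for the elided conjunct, something no purely linear combination of the word vectors replicates.

```python
import numpy as np

# Toy sketch of copying in VP-ellipsis: for "Cats chase dogs and so do
# foxes", the verb phrase "chase dogs" is elided in the second conjunct
# and must be copied when building the sentence embedding.
d = 4
rng = np.random.default_rng(2)
cats, foxes, dogs = (rng.normal(size=d) for _ in range(3))
chase = rng.normal(size=(d, d))        # transitive verb as a matrix

vp = chase @ dogs                      # "chase dogs", computed once
clause1 = cats * vp                    # subject composed with the VP
clause2 = foxes * vp                   # elided VP copied, not re-derived
sentence = clause1 + clause2           # conjunction as addition

# A linear model adds each word vector once and so cannot let the VP
# contribute to both conjuncts.
print(sentence.shape)                  # (4,)
```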

Proceedings of the IWCS Workshop Vector Semantics for Discourse and Dialogue
Mehrnoosh Sadrzadeh | Matthew Purver | Arash Eshghi | Julian Hough | Ruth Kempson | Patrick G. T. Healey

2017

Proceedings of the 15th Meeting on the Mathematics of Language
Makoto Kanazawa | Philippe de Groote | Mehrnoosh Sadrzadeh

2016

Distributional Inclusion Hypothesis for Tensor-based Composition
Dimitri Kartsaklis | Mehrnoosh Sadrzadeh
Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers

According to the distributional inclusion hypothesis, entailment between words can be measured via the feature inclusions of their distributional vectors. In recent work, we showed how this hypothesis can be extended from words to phrases and sentences in the setting of compositional distributional semantics. This paper focuses on inclusion properties of tensors; its main contribution is a theoretical and experimental analysis of how feature inclusion works in different concrete models of verb tensors. We present results for relational, Frobenius, projective, and holistic methods and compare them to the simple vector addition, multiplication, min, and max models. The degrees of entailment thus obtained are evaluated via a variety of existing word-based measures, such as Weeds’ and Clarke’s, KL-divergence, APinc, balAPinc, and two of our previously proposed metrics at the phrase/sentence level. We perform experiments on three entailment datasets, investigating which version of tensor-based composition achieves the highest performance when combined with the sentence-level measures.
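
As a concrete instance of feature inclusion, here is a minimal sketch with toy vectors of a Weeds-precision-style score, one measure in the family the paper evaluates: the weight of the narrower word's features that the broader word shares, normalised by the narrower word's total weight.

```python
import numpy as np

# Feature-inclusion entailment sketch (toy count vectors). Under the
# distributional inclusion hypothesis, if u entails v then the nonzero
# features (contexts) of u should largely be included in those of v.
u = np.array([0.0, 2.0, 1.0, 0.0, 3.0])   # e.g. "dog"
v = np.array([1.0, 4.0, 2.0, 0.5, 5.0])   # e.g. "animal"

def inclusion(u: np.ndarray, v: np.ndarray) -> float:
    # Weight of u's features that v also has, over u's total weight.
    shared = u[(u > 0) & (v > 0)].sum()
    return shared / u.sum()

print(inclusion(u, v))   # 1.0: every feature of "dog" occurs for "animal"
print(inclusion(v, u))   # 0.88: "animal" has features "dog" lacks
```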

Compositional Distributional Models of Meaning
Mehrnoosh Sadrzadeh | Dimitri Kartsaklis
Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Tutorial Abstracts

Compositional distributional models of meaning (CDMs) provide a function that produces a vectorial representation for a phrase or a sentence by composing the vectors of its words. Being the natural evolution of the traditional and well-studied distributional models at the word level, CDMs are steadily evolving into a popular and active area of NLP. This COLING 2016 tutorial aims at providing a concise introduction to this emerging field, presenting the different classes of CDMs and the various issues related to them in sufficient detail.
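
The basic classes of composition functions such a tutorial covers can be shown in a few lines; the vectors below are random stand-ins for corpus-derived embeddings.

```python
import numpy as np

# Three standard composition classes for a phrase like "red car"
# (toy random vectors in place of corpus-derived embeddings).
d = 4
rng = np.random.default_rng(3)
red, car = rng.normal(size=d), rng.normal(size=d)

additive = red + car                   # simple mixture model
multiplicative = red * car             # pointwise intersective model

# Tensor-based (categorical) model: the adjective is a matrix acting
# on the noun, mirroring its functional grammatical type.
RED = rng.normal(size=(d, d))
tensor_based = RED @ car
print(additive.shape, multiplicative.shape, tensor_based.shape)
```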

Robust Co-occurrence Quantification for Lexical Distributional Semantics
Dmitrijs Milajevs | Mehrnoosh Sadrzadeh | Matthew Purver
Proceedings of the ACL 2016 Student Research Workshop

2015

Proceedings of the 11th International Conference on Computational Semantics
Matthew Purver | Mehrnoosh Sadrzadeh | Matthew Stone

A Frobenius Model of Information Structure in Categorical Compositional Distributional Semantics
Dimitri Kartsaklis | Mehrnoosh Sadrzadeh
Proceedings of the 14th Meeting on the Mathematics of Language (MoL 2015)

Concrete Models and Empirical Evaluations for the Categorical Compositional Distributional Model of Meaning
Edward Grefenstette | Mehrnoosh Sadrzadeh
Computational Linguistics, Volume 41, Issue 1 - March 2015

2014

Evaluating Neural Word Representations in Tensor-Based Compositional Settings
Dmitrijs Milajevs | Dimitri Kartsaklis | Mehrnoosh Sadrzadeh | Matthew Purver
Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP)

Resolving Lexical Ambiguity in Tensor Regression Models of Meaning
Dimitri Kartsaklis | Nal Kalchbrenner | Mehrnoosh Sadrzadeh
Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)

2013

Prior Disambiguation of Word Tensors for Constructing Sentence Vectors
Dimitri Kartsaklis | Mehrnoosh Sadrzadeh
Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing

Multi-Step Regression Learning for Compositional Distributional Semantics
E. Grefenstette | G. Dinu | Y. Zhang | M. Sadrzadeh | M. Baroni
Proceedings of the 10th International Conference on Computational Semantics (IWCS 2013) – Long Papers

The Frobenius Anatomy of Relative Pronouns
Stephen Clark | Bob Coecke | Mehrnoosh Sadrzadeh
Proceedings of the 13th Meeting on the Mathematics of Language (MoL 13)

Separating Disambiguation from Composition in Distributional Semantics
Dimitri Kartsaklis | Mehrnoosh Sadrzadeh | Stephen Pulman
Proceedings of the Seventeenth Conference on Computational Natural Language Learning

2012

A Unified Sentence Space for Categorical Distributional-Compositional Semantics: Theory and Experiments
Dimitri Kartsaklis | Mehrnoosh Sadrzadeh | Stephen Pulman
Proceedings of COLING 2012: Posters

2011

Concrete Sentence Spaces for Compositional Distributional Models of Meaning
Edward Grefenstette | Mehrnoosh Sadrzadeh | Stephen Clark | Bob Coecke | Stephen Pulman
Proceedings of the Ninth International Conference on Computational Semantics (IWCS 2011)

Experimenting with transitive verbs in a DisCoCat
Edward Grefenstette | Mehrnoosh Sadrzadeh
Proceedings of the GEMS 2011 Workshop on GEometrical Models of Natural Language Semantics

Experimental Support for a Categorical Compositional Distributional Model of Meaning
Edward Grefenstette | Mehrnoosh Sadrzadeh
Proceedings of the 2011 Conference on Empirical Methods in Natural Language Processing