Unified Representation for Non-compositional and Compositional Expressions

Ziheng Zeng, Suma Bhat


Abstract
Accurate processing of non-compositional language relies on generating good representations for such expressions. In this work, we study the representation of language non-compositionality by proposing a language model, PIER+, that builds on BART and can create semantically meaningful and contextually appropriate representations for English potentially idiomatic expressions (PIEs). PIEs are characterized by their non-compositionality and the contextual ambiguity between their literal and idiomatic interpretations. Via intrinsic evaluation of embedding quality and extrinsic evaluation on PIE processing and NLU tasks, we show that representations generated by PIER+ yield a 33% higher homogeneity score for embedding clustering than BART, as well as gains of 3.12% in accuracy and 3.29% in sequence accuracy for PIE sense classification and span detection, respectively, compared to the state-of-the-art IE representation model, GIEA. These gains are achieved without sacrificing PIER+’s performance on NLU tasks (+/- 1% accuracy) compared to BART.
Anthology ID:
2023.findings-emnlp.783
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
11696–11710
URL:
https://aclanthology.org/2023.findings-emnlp.783
DOI:
10.18653/v1/2023.findings-emnlp.783
Cite (ACL):
Ziheng Zeng and Suma Bhat. 2023. Unified Representation for Non-compositional and Compositional Expressions. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 11696–11710, Singapore. Association for Computational Linguistics.
Cite (Informal):
Unified Representation for Non-compositional and Compositional Expressions (Zeng & Bhat, Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.783.pdf