On the Relationship between Sentence Analogy Identification and Sentence Structure Encoding in Large Language Models

Thilini Wijesiriwardene, Ruwan Wickramarachchi, Aishwarya Naresh Reganti, Vinija Jain, Aman Chadha, Amit Sheth, Amitava Das


Abstract
The ability of Large Language Models (LLMs) to encode syntactic and semantic structures of language is well examined in NLP. Additionally, analogy identification, in the form of word analogies, has been studied extensively over the last decade of language modeling literature. In this work we specifically look at how LLMs’ ability to capture sentence analogies (sentences that convey analogous meaning to each other) varies with their ability to encode the syntactic and semantic structures of sentences. Through our analysis, we find that LLMs’ ability to identify sentence analogies is positively correlated with their ability to encode syntactic and semantic structures of sentences. Specifically, we find that LLMs which better capture syntactic structures also show higher ability to identify sentence analogies.
Anthology ID:
2024.findings-eacl.31
Volume:
Findings of the Association for Computational Linguistics: EACL 2024
Month:
March
Year:
2024
Address:
St. Julian’s, Malta
Editors:
Yvette Graham, Matthew Purver
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
451–457
URL:
https://aclanthology.org/2024.findings-eacl.31
Cite (ACL):
Thilini Wijesiriwardene, Ruwan Wickramarachchi, Aishwarya Naresh Reganti, Vinija Jain, Aman Chadha, Amit Sheth, and Amitava Das. 2024. On the Relationship between Sentence Analogy Identification and Sentence Structure Encoding in Large Language Models. In Findings of the Association for Computational Linguistics: EACL 2024, pages 451–457, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
On the Relationship between Sentence Analogy Identification and Sentence Structure Encoding in Large Language Models (Wijesiriwardene et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-eacl.31.pdf