Compositional Evaluation on Japanese Textual Entailment and Similarity

Hitomi Yanaka, Koji Mineshima


Abstract
Natural Language Inference (NLI) and Semantic Textual Similarity (STS) are widely used benchmark tasks for compositional evaluation of pre-trained language models. Despite growing interest in linguistic universals, most NLI/STS studies have focused almost exclusively on English. In particular, there are no available multilingual NLI/STS datasets in Japanese, which is typologically different from English and can shed light on the currently controversial behavior of language models in matters such as sensitivity to word order and case particles. Against this background, we introduce JSICK, a Japanese NLI/STS dataset that was manually translated from the English dataset SICK. We also present a stress-test dataset for compositional inference, created by transforming syntactic structures of sentences in JSICK to investigate whether language models are sensitive to word order and case particles. We conduct baseline experiments on different pre-trained language models and compare the performance of multilingual models when applied to Japanese and other languages. The results of the stress-test experiments suggest that the current pre-trained language models are insensitive to word order and case marking.
Anthology ID:
2022.tacl-1.73
Volume:
Transactions of the Association for Computational Linguistics, Volume 10
Month:
Year:
2022
Address:
Cambridge, MA
Editors:
Brian Roark, Ani Nenkova
Venue:
TACL
Publisher:
MIT Press
Pages:
1266–1284
URL:
https://aclanthology.org/2022.tacl-1.73
DOI:
10.1162/tacl_a_00518
Cite (ACL):
Hitomi Yanaka and Koji Mineshima. 2022. Compositional Evaluation on Japanese Textual Entailment and Similarity. Transactions of the Association for Computational Linguistics, 10:1266–1284.
Cite (Informal):
Compositional Evaluation on Japanese Textual Entailment and Similarity (Yanaka & Mineshima, TACL 2022)
PDF:
https://aclanthology.org/2022.tacl-1.73.pdf
Video:
https://aclanthology.org/2022.tacl-1.73.mp4