Evaluating Structural Generalization in Neural Machine Translation

Ryoma Kumon, Daiki Matsuoka, Hitomi Yanaka


Abstract
Compositional generalization refers to the ability to generalize to novel combinations of previously observed words and syntactic structures. Since it is regarded as a desired property of neural models, recent work has assessed compositional generalization in machine translation as well as semantic parsing. However, previous evaluations with machine translation have focused mostly on lexical generalization (i.e., generalization to unseen combinations of known words). Thus, it remains unclear to what extent models can translate sentences that require structural generalization (i.e., generalization to different sorts of syntactic structures). To address this question, we construct SGET, a machine translation dataset covering various types of compositional generalization with control of words and sentence structures. We evaluate neural machine translation models on SGET and show that they struggle more in structural generalization than in lexical generalization. We also find different performance trends in semantic parsing and machine translation, which indicates the importance of evaluations across various tasks.
Anthology ID:
2024.findings-acl.783
Volume:
Findings of the Association for Computational Linguistics ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand and virtual meeting
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
13220–13239
URL:
https://aclanthology.org/2024.findings-acl.783
Cite (ACL):
Ryoma Kumon, Daiki Matsuoka, and Hitomi Yanaka. 2024. Evaluating Structural Generalization in Neural Machine Translation. In Findings of the Association for Computational Linguistics ACL 2024, pages 13220–13239, Bangkok, Thailand and virtual meeting. Association for Computational Linguistics.
Cite (Informal):
Evaluating Structural Generalization in Neural Machine Translation (Kumon et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.783.pdf