On Length Divergence Bias in Textual Matching Models

Lan Jiang, Tianshu Lyu, Yankai Lin, Meng Chong, Xiaoyong Lyu, Dawei Yin


Abstract
Despite the remarkable success deep models have achieved in Textual Matching (TM) tasks, it remains unclear whether they truly understand language or merely measure the semantic similarity of texts by exploiting statistical biases in datasets. In this work, we provide a new perspective on this issue by studying the length divergence bias. We find that the length divergence heuristic is widespread in prevalent TM datasets, providing direct cues for prediction. To determine whether TM models have adopted this heuristic, we introduce an adversarial evaluation scheme that invalidates it. In this adversarial setting, all TM models perform worse, indicating that they have indeed adopted the heuristic. Through a well-designed probing experiment, we empirically validate that the bias of TM models can be attributed in part to their extraction of text length information during training. To alleviate the length divergence bias, we propose an adversarial training method. The results demonstrate that this method improves the robustness and generalization ability of TM models at the same time.
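The "length divergence heuristic" the abstract refers to can be illustrated with a rough, do-it-yourself check. The sketch below is a hypothetical illustration, not the paper's actual procedure or code: it estimates how well the absolute token-length difference of a text pair, on its own, predicts the match label of a TM dataset, which is one way length divergence can provide "direct cues for prediction". The function name and the toy data are invented for this example.

```python
# Hypothetical sketch (not the paper's method): does length divergence alone
# predict the labels of a textual-matching dataset?

def length_divergence_cue_accuracy(pairs, labels):
    """pairs: list of (text_a, text_b); labels: list of binary match labels.
    Fits a single threshold on |len(a) - len(b)| (in whitespace tokens) and
    reports how well that one feature predicts the labels, compared with the
    majority-class baseline."""
    divergences = [abs(len(a.split()) - len(b.split())) for a, b in pairs]
    best_acc = 0.0
    for t in sorted(set(divergences)):
        # Predict "match" when the two texts are close in length.
        preds = [1 if d <= t else 0 for d in divergences]
        acc = sum(p == y for p, y in zip(preds, labels)) / len(labels)
        best_acc = max(best_acc, acc)
    majority = max(sum(labels), len(labels) - sum(labels)) / len(labels)
    return best_acc, majority

# Toy usage: a paraphrase pair with similar lengths vs. a non-match pair
# with very different lengths.
pairs = [
    ("how do I cook rice", "how to cook rice"),
    ("how do I cook rice",
     "a long unrelated question about visa requirements for travel"),
]
labels = [1, 0]
print(length_divergence_cue_accuracy(pairs, labels))
```

If the thresholded length feature scores well above the majority-class baseline on a real dataset, the data carries the kind of length cue the paper's adversarial evaluation is designed to invalidate.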
Anthology ID:
2022.findings-acl.330
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4187–4193
URL:
https://aclanthology.org/2022.findings-acl.330
DOI:
10.18653/v1/2022.findings-acl.330
Cite (ACL):
Lan Jiang, Tianshu Lyu, Yankai Lin, Meng Chong, Xiaoyong Lyu, and Dawei Yin. 2022. On Length Divergence Bias in Textual Matching Models. In Findings of the Association for Computational Linguistics: ACL 2022, pages 4187–4193, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
On Length Divergence Bias in Textual Matching Models (Jiang et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.330.pdf
Data
GLUE, TrecQA