Entailment as Robust Self-Learner

Jiaxin Ge, Hongyin Luo, Yoon Kim, James Glass


Abstract
Entailment has been recognized as an important metric for evaluating natural language understanding (NLU) models, and recent studies have found that entailment pretraining benefits weakly supervised fine-tuning. In this work, we first design a prompting strategy that formulates a number of different NLU tasks as contextual entailment, which improves the zero-shot adaptation of pretrained entailment models. Second, we observe that self-training entailment-based models with unlabeled data can significantly improve their adaptation performance on downstream tasks. To achieve more stable improvement, we propose the Simple Pseudo-Label Editing (SimPLE) algorithm, which improves pseudo-labeling quality in self-training. We also find that both pretrained entailment-based models and the self-trained models are robust against adversarial evaluation data. Experiments on binary and multi-class classification tasks show that SimPLE leads to more robust self-training results, indicating that self-trained entailment models are more efficient and trustworthy than large language models on language understanding tasks.
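To illustrate the entailment-as-classification idea described in the abstract, the sketch below casts a sentiment-classification example as contextual entailment with an off-the-shelf MNLI checkpoint. This is a minimal, hedged illustration only: the checkpoint name, label hypotheses, and wording are assumptions for demonstration and are not the paper's exact prompting strategy or models.

# Minimal sketch (assumed setup, not the authors' exact method): each candidate
# label is turned into a hypothesis, and the predicted class is the label whose
# hypothesis the entailment model judges most entailed by the input (premise).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "roberta-large-mnli"  # assumed checkpoint for illustration
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

premise = "The movie was a delightful surprise from start to finish."
hypotheses = {
    "positive": "This example expresses a positive sentiment.",
    "negative": "This example expresses a negative sentiment.",
}

entail_idx = model.config.label2id.get("ENTAILMENT", 2)
scores = {}
with torch.no_grad():
    for label, hypothesis in hypotheses.items():
        inputs = tokenizer(premise, hypothesis, return_tensors="pt")
        logits = model(**inputs).logits
        # Probability that the premise entails this label's hypothesis.
        scores[label] = logits.softmax(dim=-1)[0, entail_idx].item()

print(max(scores, key=scores.get))  # expected: "positive"

In a self-training setting such as the one studied in the paper, entailment scores like these would be used to assign pseudo-labels to unlabeled data, with SimPLE further editing the pseudo-labels before fine-tuning; the details of that procedure are given in the paper itself.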
Anthology ID:
2023.acl-long.772
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
13803–13817
URL:
https://aclanthology.org/2023.acl-long.772
DOI:
10.18653/v1/2023.acl-long.772
Cite (ACL):
Jiaxin Ge, Hongyin Luo, Yoon Kim, and James Glass. 2023. Entailment as Robust Self-Learner. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 13803–13817, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Entailment as Robust Self-Learner (Ge et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.772.pdf
Video:
https://aclanthology.org/2023.acl-long.772.mp4