An End-to-End Contrastive Self-Supervised Learning Framework for Language Understanding

Hongchao Fang, Pengtao Xie

Abstract
Self-supervised learning (SSL) methods such as Word2vec, BERT, and GPT have shown great effectiveness in language understanding. Contrastive learning, a recent SSL approach, has attracted increasing attention in NLP: it learns data representations by predicting whether two augmented instances were generated from the same original example. Previous contrastive learning methods perform data augmentation and contrastive learning separately, so the augmented data may not be optimal for the contrastive learning task. To address this problem, we propose a four-level optimization framework that performs data augmentation and contrastive learning end-to-end, enabling the augmented data to be tailored to contrastive learning. The framework consists of four learning stages performed in a unified way: (1) training machine translation models for sentence augmentation; (2) pretraining a text encoder with contrastive learning; (3) finetuning a text classification model; and (4) updating the weights of translation data by minimizing the validation loss of the classification model. Experiments on datasets in the GLUE benchmark (Wang et al., 2018a) and on datasets used in Gururangan et al. (2020) demonstrate the effectiveness of our method.
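The contrastive pretraining stage (stage 2) operates on translation-augmented views of each sentence. The sketch below is a minimal illustration under a common SimCLR-style InfoNCE formulation; the encoder, the augment back-translation helper, and the temperature value are hypothetical placeholders, not the authors' implementation.

# Minimal sketch of the contrastive pretraining stage, assuming a
# SimCLR-style InfoNCE objective over back-translated views of a batch
# of sentences. `encoder` and `augment` are hypothetical stand-ins for
# the paper's text encoder and MT-based augmentation (stage 1).
import torch
import torch.nn.functional as F

def info_nce_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.1):
    """z1, z2: (batch, dim) embeddings of two augmented views of one batch.
    Positive pairs are (z1[i], z2[i]); all other in-batch pairs are negatives."""
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature          # (batch, batch) cosine similarities
    labels = torch.arange(z1.size(0), device=z1.device)
    # Symmetrize: each view must identify its counterpart among the other view's batch.
    return 0.5 * (F.cross_entropy(logits, labels) +
                  F.cross_entropy(logits.t(), labels))

# Usage (hypothetical augmentation via round-trip translation):
# views_a = encoder(augment(batch, pivot="de"))  # e.g., en -> de -> en
# views_b = encoder(augment(batch, pivot="fr"))  # e.g., en -> fr -> en
# loss = info_nce_loss(views_a, views_b)

In the paper's end-to-end setup, the per-example translation-data weights learned in stage 4 would reweight this pretraining signal rather than leaving the augmentation fixed.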
Anthology ID:
2022.tacl-1.76
Volume:
Transactions of the Association for Computational Linguistics, Volume 10
Year:
2022
Address:
Cambridge, MA
Editors:
Brian Roark, Ani Nenkova
Venue:
TACL
Publisher:
MIT Press
Pages:
1324–1340
URL:
https://aclanthology.org/2022.tacl-1.76
DOI:
10.1162/tacl_a_00521
Cite (ACL):
Hongchao Fang and Pengtao Xie. 2022. An End-to-End Contrastive Self-Supervised Learning Framework for Language Understanding. Transactions of the Association for Computational Linguistics, 10:1324–1340.
Cite (Informal):
An End-to-End Contrastive Self-Supervised Learning Framework for Language Understanding (Fang & Xie, TACL 2022)
PDF:
https://aclanthology.org/2022.tacl-1.76.pdf