Improving Contextual Representation with Gloss Regularized Pre-training

Yu Lin, Zhecheng An, Peihao Wu, Zejun Ma


Abstract
Though achieving impressive results on many NLP tasks, BERT-like masked language models (MLM) suffer from a discrepancy between pre-training and inference. In light of this gap, we investigate the contextual representations of pre-training and inference from the perspective of word probability distributions. We find that BERT risks neglecting contextual word similarity during pre-training. To tackle this issue, we propose adding an auxiliary gloss regularizer module to BERT pre-training (GR-BERT) to enhance word semantic similarity. By simultaneously predicting masked words and aligning contextual embeddings to their corresponding glosses, word similarity can be explicitly modeled. We design two architectures for GR-BERT and evaluate our model on downstream tasks. Experimental results show that the gloss regularizer benefits BERT in both word-level and sentence-level semantic representation. GR-BERT achieves a new state of the art on the lexical substitution task and greatly improves BERT sentence representations on both unsupervised and supervised STS tasks.
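The abstract describes the core idea: predict masked words and, at the same time, align the contextual embeddings of those words with embeddings of their glosses. The sketch below is a minimal, hypothetical PyTorch illustration of such a combined objective, assuming an InfoNCE-style alignment with in-batch negatives; the module name, projection layer, and temperature are illustrative assumptions, not the authors' implementation (which also defines two GR-BERT architectures that this sketch does not distinguish).

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class GlossRegularizedMLMLoss(nn.Module):
        # Hypothetical combined objective: standard MLM cross-entropy plus a
        # gloss-alignment term that pulls the contextual embedding of each
        # masked token toward the embedding of its gloss (in-batch negatives).
        def __init__(self, hidden_size: int, vocab_size: int, temperature: float = 0.05):
            super().__init__()
            self.mlm_head = nn.Linear(hidden_size, vocab_size)  # token prediction head
            self.proj = nn.Linear(hidden_size, hidden_size)     # maps context vectors into the gloss space
            self.temperature = temperature

        def forward(self, hidden_states, masked_positions, target_ids, gloss_embeddings):
            # hidden_states:    (batch, seq_len, hidden) encoder outputs
            # masked_positions: (num_masked, 2) pairs of (batch_index, token_index) for masked tokens
            # target_ids:       (num_masked,) original vocabulary ids of the masked tokens
            # gloss_embeddings: (num_masked, hidden) one gloss embedding per masked word
            ctx = hidden_states[masked_positions[:, 0], masked_positions[:, 1]]  # (num_masked, hidden)

            # Standard masked-language-modeling loss.
            mlm_loss = F.cross_entropy(self.mlm_head(ctx), target_ids)

            # Gloss regularizer: contrastive alignment of context vectors to glosses.
            z = F.normalize(self.proj(ctx), dim=-1)
            g = F.normalize(gloss_embeddings, dim=-1)
            logits = z @ g.t() / self.temperature              # (num_masked, num_masked) similarity matrix
            labels = torch.arange(z.size(0), device=z.device)  # the matching gloss sits on the diagonal
            gloss_loss = F.cross_entropy(logits, labels)

            return mlm_loss + gloss_loss

Under these assumptions, gloss_embeddings would come from encoding each masked word's dictionary definition with a gloss encoder and pooling; how that gloss encoder is coupled to BERT is where the paper's two architectures differ, which this sketch does not capture.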
Anthology ID:
2022.findings-naacl.68
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
907–920
URL:
https://aclanthology.org/2022.findings-naacl.68
DOI:
10.18653/v1/2022.findings-naacl.68
Cite (ACL):
Yu Lin, Zhecheng An, Peihao Wu, and Zejun Ma. 2022. Improving Contextual Representation with Gloss Regularized Pre-training. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 907–920, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Improving Contextual Representation with Gloss Regularized Pre-training (Lin et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.68.pdf
Software:
2022.findings-naacl.68.software.zip
Video:
https://aclanthology.org/2022.findings-naacl.68.mp4
Data:
SICK