NB-MLM: Efficient Domain Adaptation of Masked Language Models for Sentiment Analysis

Nikolay Arefyev, Dmitrii Kharchev, Artem Shelmanov


Abstract
While Masked Language Models (MLMs) are pre-trained on massive datasets, additional training with the MLM objective on domain- or task-specific data before fine-tuning for the final task is known to improve the final performance. This is usually referred to as the domain or task adaptation step. However, unlike the initial pre-training, this step is performed for each domain or task individually and is still rather slow, requiring several GPU-days compared to the several GPU-hours required for final task fine-tuning. We argue that the standard MLM objective leads to inefficiency when it is used for the adaptation step because it mostly learns to predict the most frequent words, which are not necessarily related to the final task. We propose a technique for more efficient adaptation that focuses on predicting words with large weights of a Naive Bayes classifier trained for the task at hand; such words are likely more relevant to the task than the most frequent words. The proposed method provides faster adaptation and better final performance for sentiment analysis compared to the standard approach.
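The idea in the abstract can be sketched in a few lines: estimate per-word Naive Bayes weights from the labeled task data, then bias the MLM masking distribution toward words with large weights instead of masking uniformly at random. The sketch below is illustrative, not the paper's exact implementation; the smoothing constant `alpha`, the `temperature` parameter, and the sampling scheme are assumptions.

```python
import math
import random
from collections import Counter

def nb_weights(texts, labels, alpha=1.0):
    """Per-word Naive Bayes log-odds weights for binary sentiment labels.
    Words strongly associated with either class get large weights.
    (alpha is an assumed add-one-style smoothing constant.)"""
    pos, neg = Counter(), Counter()
    for text, y in zip(texts, labels):
        (pos if y == 1 else neg).update(text.split())
    vocab = set(pos) | set(neg)
    n_pos = sum(pos.values()) + alpha * len(vocab)
    n_neg = sum(neg.values()) + alpha * len(vocab)
    return {
        w: abs(math.log((pos[w] + alpha) / n_pos)
               - math.log((neg[w] + alpha) / n_neg))
        for w in vocab
    }

def nb_mask(tokens, weights, mask_rate=0.15, temperature=1.0):
    """Pick positions to mask with probability proportional to
    exp(weight / temperature), rather than uniformly as in standard MLM.
    Task-relevant words are therefore masked (and predicted) more often."""
    k = max(1, round(mask_rate * len(tokens)))
    scores = [math.exp(weights.get(t, 0.0) / temperature) for t in tokens]
    chosen = set(random.choices(range(len(tokens)), weights=scores, k=k))
    return ["[MASK]" if i in chosen else t for i, t in enumerate(tokens)]
```

On a toy corpus, sentiment-bearing words such as "great" or "terrible" receive larger weights than neutral words like "the", so they are masked more frequently during adaptation.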
Anthology ID:
2021.emnlp-main.717
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
9114–9124
URL:
https://aclanthology.org/2021.emnlp-main.717
DOI:
10.18653/v1/2021.emnlp-main.717
Cite (ACL):
Nikolay Arefyev, Dmitrii Kharchev, and Artem Shelmanov. 2021. NB-MLM: Efficient Domain Adaptation of Masked Language Models for Sentiment Analysis. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 9114–9124, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
NB-MLM: Efficient Domain Adaptation of Masked Language Models for Sentiment Analysis (Arefyev et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.717.pdf
Video:
https://aclanthology.org/2021.emnlp-main.717.mp4
Code:
nvanva/nb-mlm