Effective Unsupervised Domain Adaptation with Adversarially Trained Language Models

Thuy-Trang Vu, Dinh Phung, Gholamreza Haffari


Abstract
Recent work has shown the importance of adapting broad-coverage contextualised embedding models to the domain of the target task of interest. Current self-supervised adaptation methods are simplistic, as the training signal comes from a small percentage of randomly masked-out tokens. In this paper, we show that careful masking strategies can bridge the knowledge gap of masked language models (MLMs) about the domains more effectively by allocating self-supervision where it is needed. Furthermore, we propose an effective training strategy by adversarially masking out those tokens which are harder to reconstruct by the underlying MLM. The adversarial objective leads to a challenging combinatorial optimisation problem over subsets of tokens, which we tackle efficiently through relaxation to a variational lower bound and dynamic programming. On six unsupervised domain adaptation tasks involving named entity recognition, our method strongly outperforms the random masking strategy and achieves F1 score improvements of up to +1.64.
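The core idea of adversarial masking, selecting the tokens the MLM currently reconstructs worst rather than a uniform random subset, can be illustrated with a minimal sketch. Note this is a simplified greedy top-k illustration of loss-based masking, not the paper's actual method, which optimises over token subsets via a variational lower bound and dynamic programming; the function name, the example tokens, and the per-token losses are all hypothetical.

```python
import math

def adversarial_mask(tokens, token_losses, mask_ratio=0.15, mask_token="[MASK]"):
    """Greedy sketch of loss-based adversarial masking: mask the tokens the
    MLM finds hardest to reconstruct (highest loss) instead of a random
    subset, so self-supervision is allocated where it is needed."""
    k = max(1, math.ceil(mask_ratio * len(tokens)))
    # Indices of the k tokens with the largest reconstruction loss.
    hardest = set(
        sorted(range(len(tokens)), key=lambda i: token_losses[i], reverse=True)[:k]
    )
    return [mask_token if i in hardest else t for i, t in enumerate(tokens)]

# Hypothetical domain-specific sentence: rare in-domain words get high loss.
tokens = ["the", "troponin", "level", "was", "elevated"]
losses = [0.1, 5.2, 0.8, 0.2, 3.1]  # hypothetical per-token MLM losses
print(adversarial_mask(tokens, losses, mask_ratio=0.3))
# → ['the', '[MASK]', 'level', 'was', '[MASK]']
```

In a real adaptation loop, the per-token losses would come from a forward pass of the MLM over the unlabelled target-domain corpus, and masking would be re-computed as the model improves.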
Anthology ID:
2020.emnlp-main.497
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
6163–6173
URL:
https://aclanthology.org/2020.emnlp-main.497
DOI:
10.18653/v1/2020.emnlp-main.497
Cite (ACL):
Thuy-Trang Vu, Dinh Phung, and Gholamreza Haffari. 2020. Effective Unsupervised Domain Adaptation with Adversarially Trained Language Models. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 6163–6173, Online. Association for Computational Linguistics.
Cite (Informal):
Effective Unsupervised Domain Adaptation with Adversarially Trained Language Models (Vu et al., EMNLP 2020)
PDF:
https://aclanthology.org/2020.emnlp-main.497.pdf
Video:
https://slideslive.com/38939334
Code:
trangvu/mlm4uda
Data:
FIN, WNUT 2016 NER