Unsupervised Domain Adaptation of Language Models for Reading Comprehension

Kosuke Nishida, Kyosuke Nishida, Itsumi Saito, Hisako Asano, Junji Tomita


Abstract
This study tackles unsupervised domain adaptation of reading comprehension (UDARC). Reading comprehension (RC) is the task of answering questions over textual sources. State-of-the-art RC models still lack general linguistic intelligence; that is, their accuracy degrades on out-domain datasets not used during training. We hypothesize that this gap is caused by a lack of language modeling (LM) capability for the out-domain. The UDARC setting allows models to use supervised RC training data in the source domain and only unlabeled passages in the target domain. To solve UDARC, we present two domain adaptation models. The first learns the out-domain LM and the in-domain RC task sequentially. The second, our proposed model, uses multi-task learning of LM and RC. Both models can retain the RC capability acquired from the supervised source-domain data and the LM capability acquired from the unlabeled target-domain data. We evaluated the models on UDARC with five datasets from different domains. The models outperformed the model without domain adaptation. In particular, the proposed model yielded an improvement of 4.3/4.2 points in EM/F1 on an unseen biomedical domain.
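The multi-task variant described in the abstract can be pictured with a minimal PyTorch sketch: a shared encoder trained jointly on a masked-LM objective over unlabeled target-domain passages and a span-extraction RC objective over labeled source-domain data. This is not the authors' released code; the encoder size, the equal loss weighting, the mask token id, and all identifiers below are illustrative assumptions.

```python
# Minimal multi-task LM + RC sketch (assumptions throughout, not the paper's code).
import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    """Stand-in for a Transformer encoder (e.g., BERT) shared by both tasks."""
    def __init__(self, vocab_size=30522, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, token_ids):
        return self.encoder(self.embed(token_ids))        # (batch, seq, hidden)

class MultiTaskRC(nn.Module):
    def __init__(self, vocab_size=30522, hidden=256):
        super().__init__()
        self.encoder = SharedEncoder(vocab_size, hidden)
        self.lm_head = nn.Linear(hidden, vocab_size)       # masked-LM head
        self.span_head = nn.Linear(hidden, 2)               # start/end logits for RC

    def lm_loss(self, token_ids, labels):
        """Masked-LM loss on unlabeled target-domain passages.
        `labels` holds original ids at masked positions and -100 elsewhere."""
        logits = self.lm_head(self.encoder(token_ids))
        return nn.functional.cross_entropy(
            logits.view(-1, logits.size(-1)), labels.view(-1), ignore_index=-100)

    def rc_loss(self, token_ids, start_pos, end_pos):
        """Span-extraction loss on labeled source-domain (question, passage) pairs."""
        start_logits, end_logits = self.span_head(self.encoder(token_ids)).split(1, dim=-1)
        loss_start = nn.functional.cross_entropy(start_logits.squeeze(-1), start_pos)
        loss_end = nn.functional.cross_entropy(end_logits.squeeze(-1), end_pos)
        return (loss_start + loss_end) / 2

# One joint training step: sum the two losses (equal weighting is an assumption).
model = MultiTaskRC()
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)

src_ids = torch.randint(0, 30522, (2, 64))                 # source-domain RC batch
start, end = torch.tensor([5, 10]), torch.tensor([8, 15])  # gold answer spans

tgt_ids = torch.randint(0, 30522, (2, 64))                 # target-domain passages
masked = tgt_ids.clone()
masked[:, 3] = 103                                         # assumed [MASK] token id
lm_labels = torch.full((2, 64), -100, dtype=torch.long)    # -100 = ignored position
lm_labels[:, 3] = tgt_ids[:, 3]

optimizer.zero_grad()
loss = model.rc_loss(src_ids, start, end) + model.lm_loss(masked, lm_labels)
loss.backward()
optimizer.step()
```

The sequential baseline mentioned in the abstract would instead run the LM objective to convergence on target-domain passages first and then fine-tune on the source-domain RC objective, rather than summing the two losses per step.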
Anthology ID:
2020.lrec-1.663
Volume:
Proceedings of the Twelfth Language Resources and Evaluation Conference
Month:
May
Year:
2020
Address:
Marseille, France
Editors:
Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Asuncion Moreno, Jan Odijk, Stelios Piperidis
Venue:
LREC
Publisher:
European Language Resources Association
Pages:
5392–5399
Language:
English
URL:
https://aclanthology.org/2020.lrec-1.663
Cite (ACL):
Kosuke Nishida, Kyosuke Nishida, Itsumi Saito, Hisako Asano, and Junji Tomita. 2020. Unsupervised Domain Adaptation of Language Models for Reading Comprehension. In Proceedings of the Twelfth Language Resources and Evaluation Conference, pages 5392–5399, Marseille, France. European Language Resources Association.
Cite (Informal):
Unsupervised Domain Adaptation of Language Models for Reading Comprehension (Nishida et al., LREC 2020)
PDF:
https://aclanthology.org/2020.lrec-1.663.pdf
Data
BioASQ, BookCorpus, DuoRC, MRQA, Natural Questions, NewsQA, SQuAD