Back-Training excels Self-Training at Unsupervised Domain Adaptation of Question Generation and Passage Retrieval

Devang Kulshreshtha, Robert Belfer, Iulian Vlad Serban, Siva Reddy


Abstract
In this work, we introduce back-training, an alternative to self-training for unsupervised domain adaptation (UDA). While self-training generates synthetic training data where natural inputs are aligned with noisy outputs, back-training results in natural outputs aligned with noisy inputs. This significantly reduces the gap between the target-domain and synthetic data distributions, and reduces model overfitting to the source domain. We run UDA experiments on question generation and passage retrieval from the Natural Questions domain to machine learning and biomedical domains. We find that back-training vastly outperforms self-training by a mean improvement of 7.8 BLEU-4 points on generation, and 17.6% top-20 retrieval accuracy across both domains. We further propose consistency filters to remove low-quality synthetic data before training. We also release a new domain-adaptation dataset, MLQuestions, containing 35K unaligned questions, 50K unaligned passages, and 3K aligned question-passage pairs.
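The contrast between the two synthetic-data constructions described in the abstract can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the helper names (qg_model.generate, retriever.retrieve, score_fn) are hypothetical placeholders. Self-training pairs natural target-domain inputs with model-predicted (noisy) outputs, back-training pairs natural target-domain outputs with retrieved (noisy) inputs, and a consistency filter drops low-scoring pairs before training.

```python
# Sketch of synthetic-data construction for question generation (QG) under
# self-training vs. back-training. All model/helper objects are hypothetical
# placeholders, not the paper's actual API.

def self_training_pairs(target_passages, qg_model):
    """Natural inputs (target-domain passages) paired with noisy outputs
    (questions predicted by the source-trained QG model)."""
    return [(p, qg_model.generate(p)) for p in target_passages]

def back_training_pairs(target_questions, retriever):
    """Noisy inputs (passages retrieved for each question) paired with
    natural outputs (real target-domain questions)."""
    return [(retriever.retrieve(q), q) for q in target_questions]

def consistency_filter(pairs, score_fn, threshold=0.5):
    """Remove low-quality synthetic pairs before training, keeping only
    those whose input-output consistency score clears the threshold."""
    return [(x, y) for x, y in pairs if score_fn(x, y) >= threshold]
```

In this sketch, the only difference between the two regimes is which side of each training pair is natural target-domain data and which side is model-generated, which is the distinction the abstract draws between the two methods.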
Anthology ID:
2021.emnlp-main.566
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
7064–7078
URL:
https://aclanthology.org/2021.emnlp-main.566
DOI:
10.18653/v1/2021.emnlp-main.566
Cite (ACL):
Devang Kulshreshtha, Robert Belfer, Iulian Vlad Serban, and Siva Reddy. 2021. Back-Training excels Self-Training at Unsupervised Domain Adaptation of Question Generation and Passage Retrieval. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 7064–7078, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Back-Training excels Self-Training at Unsupervised Domain Adaptation of Question Generation and Passage Retrieval (Kulshreshtha et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.566.pdf
Software:
 2021.emnlp-main.566.Software.zip
Video:
 https://aclanthology.org/2021.emnlp-main.566.mp4
Code:
McGill-NLP/MLQuestions
Data:
MLQuestions, Natural Questions, PubMedQA, VQG