%0 Conference Proceedings
%T Neural Retrieval for Question Answering with Cross-Attention Supervised Data Augmentation
%A Yang, Yinfei
%A Jin, Ning
%A Lin, Kuo
%A Guo, Mandy
%A Cer, Daniel
%Y Zong, Chengqing
%Y Xia, Fei
%Y Li, Wenjie
%Y Navigli, Roberto
%S Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
%D 2021
%8 August
%I Association for Computational Linguistics
%C Online
%F yang-etal-2021-neural-retrieval
%X Early fusion models with cross-attention have shown better-than-human performance on some question answering benchmarks, but they are a poor fit for retrieval since they prevent pre-computation of answer representations. We present a supervised data mining method that uses an accurate early fusion model to improve the training of an efficient late fusion retrieval model. We first train an accurate classification model with cross-attention between questions and answers. The cross-attention model is then used to annotate additional passages in order to generate weighted training examples for a neural retrieval model. The resulting retrieval model, trained with the additional data, significantly outperforms retrieval models trained directly on gold annotations in Precision at N (P@N) and Mean Reciprocal Rank (MRR).
%R 10.18653/v1/2021.acl-short.35
%U https://aclanthology.org/2021.acl-short.35
%U https://doi.org/10.18653/v1/2021.acl-short.35
%P 263-268