BERT-kNN: Adding a kNN Search Component to Pretrained Language Models for Better QA

Nora Kassner, Hinrich Schütze


Abstract
Khandelwal et al. (2020) use a k-nearest-neighbor (kNN) component to improve language model performance. We show that this idea is beneficial for open-domain question answering (QA). To improve the recall of facts encountered during training, we combine BERT (Devlin et al., 2019) with a traditional information retrieval (IR) step and a kNN search over a large datastore of an embedded text collection. Our contributions are as follows: i) BERT-kNN outperforms BERT on cloze-style QA by large margins without any further training. ii) We show that BERT often identifies the correct response category (e.g., US city), but only kNN recovers the factually correct answer (e.g., “Miami”). iii) Compared to BERT, BERT-kNN excels for rare facts. iv) BERT-kNN can easily handle facts not covered by BERT’s training set, e.g., recent events.
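
The abstract describes an architecture that interpolates BERT's own masked-token distribution with a kNN search over a datastore of contextual embeddings. Below is a minimal, hypothetical sketch of that interpolation in the style of Khandelwal et al. (2020): the toy in-memory datastore, brute-force L2 search with numpy, the key construction (BERT hidden states at masked-out answer positions), and the interpolation weight lam are all illustrative assumptions, and the IR step over a large text collection is omitted. This is not the authors' released implementation (see the Code link below).

import numpy as np
import torch
from transformers import BertForMaskedLM, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def mask_state_and_probs(text):
    """Final hidden state at the [MASK] position and BERT's output distribution there."""
    inputs = tokenizer(text, return_tensors="pt")
    mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero()[0, 0]
    with torch.no_grad():
        out = model(**inputs, output_hidden_states=True)
    hidden = out.hidden_states[-1][0, mask_pos].numpy()   # contextual embedding of the mask
    probs = out.logits[0, mask_pos].softmax(-1).numpy()   # BERT-only distribution
    return hidden, probs

# Toy datastore of (context embedding, answer-token id) pairs. The "retrieved"
# passages are hard-coded here; the paper instead builds the datastore from an
# embedded text collection selected by an IR step.
passages = ["The largest city in southern Florida is [MASK].",
            "The capital of France is [MASK]."]
answers = ["miami", "paris"]
keys = np.stack([mask_state_and_probs(p)[0] for p in passages])
values = np.array([tokenizer.convert_tokens_to_ids(a) for a in answers])

def bert_knn_predict(query, k=2, lam=0.5, temperature=10.0):
    """Interpolate BERT's distribution with a kNN distribution over the datastore."""
    q, p_bert = mask_state_and_probs(query)
    dists = np.linalg.norm(keys - q, axis=1)       # brute-force L2 nearest-neighbor search
    nn = np.argsort(dists)[:k]
    weights = np.exp(-dists[nn] / temperature)     # closer neighbors get more mass
    p_knn = np.zeros_like(p_bert)
    np.add.at(p_knn, values[nn], weights / weights.sum())
    p_final = lam * p_knn + (1.0 - lam) * p_bert   # lam is a hypothetical mixing weight
    return tokenizer.convert_ids_to_tokens([int(p_final.argmax())])[0]

print(bert_knn_predict("The most populous city in southern Florida is [MASK]."))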
Anthology ID:
2020.findings-emnlp.307
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3424–3430
URL:
https://aclanthology.org/2020.findings-emnlp.307
DOI:
10.18653/v1/2020.findings-emnlp.307
Cite (ACL):
Nora Kassner and Hinrich Schütze. 2020. BERT-kNN: Adding a kNN Search Component to Pretrained Language Models for Better QA. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 3424–3430, Online. Association for Computational Linguistics.
Cite (Informal):
BERT-kNN: Adding a kNN Search Component to Pretrained Language Models for Better QA (Kassner & Schütze, Findings 2020)
PDF:
https://aclanthology.org/2020.findings-emnlp.307.pdf
Video:
https://slideslive.com/38940173
Code:
norakassner/BERT-kNN
Data:
LAMA, SQuAD