DRAFT: Dense Retrieval Augmented Few-shot Topic classifier Framework

Keonwoo Kim, Younggun Lee


Abstract
With the growing volume of diverse information, the demand for classifying arbitrary topics has become increasingly critical. To address this challenge, we introduce DRAFT, a simple framework designed to train a classifier for few-shot topic classification. DRAFT uses a few examples of a specific topic as queries to construct a Customized dataset with a dense retriever model. The Multi-query retrieval (MQR) algorithm, which effectively handles multiple queries related to a specific topic, is applied to construct the Customized dataset. Subsequently, we fine-tune a classifier on the Customized dataset to identify the topic. To demonstrate the efficacy of our proposed approach, we conduct evaluations on both widely used classification benchmark datasets and manually constructed datasets with 291 diverse topics, which simulate the diverse content encountered in real-world applications. DRAFT shows competitive or superior performance compared to baselines that use in-context learning, such as GPT-3 175B and InstructGPT 175B, on few-shot topic classification tasks despite having 177 times fewer parameters, demonstrating its effectiveness.
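The retrieval step described in the abstract can be sketched in miniature. The snippet below is a minimal illustration, not the paper's method: it substitutes a toy hashed bag-of-characters "embedding" for a trained dense retriever, and approximates MQR by taking the union of top-k nearest passages across all query embeddings (the paper's actual MQR algorithm differs). The function and variable names are our own; the retrieved passages would then serve as positive examples in the Customized dataset used to fine-tune the classifier.

```python
import numpy as np

def embed(texts):
    # Toy deterministic "embedding": a hashed bag-of-characters vector.
    # A real DRAFT setup would use a trained dense retriever model here.
    vecs = []
    for t in texts:
        v = np.zeros(64)
        for i, ch in enumerate(t.lower()):
            v[(ord(ch) * 31 + i) % 64] += 1.0
        vecs.append(v / (np.linalg.norm(v) + 1e-8))
    return np.stack(vecs)

def multi_query_retrieve(queries, corpus, top_k=2):
    """Simplified stand-in for multi-query retrieval (MQR):
    retrieve the union of the top_k most similar corpus passages
    for each of the few-shot topic queries."""
    q = embed(queries)   # (num_queries, dim), unit-normalized
    c = embed(corpus)    # (num_passages, dim), unit-normalized
    sims = q @ c.T       # cosine similarity between queries and passages
    selected = set()
    for row in sims:
        # indices of the top_k highest-similarity passages for this query
        selected.update(np.argsort(row)[::-1][:top_k].tolist())
    return [corpus[i] for i in sorted(selected)]

if __name__ == "__main__":
    queries = ["solar panels", "renewable energy"]
    corpus = [
        "solar panels",
        "wind turbines and renewable energy",
        "chocolate cake recipe",
    ]
    # Passages retrieved for the topic become the Customized dataset's
    # positive class; a classifier is then fine-tuned on it.
    print(multi_query_retrieve(queries, corpus, top_k=1))
```

Taking the union across queries (rather than retrieving for each query independently) is what lets multiple few-shot examples of one topic jointly define the positive set.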
Anthology ID:
2023.findings-emnlp.150
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2278–2294
URL:
https://aclanthology.org/2023.findings-emnlp.150
DOI:
10.18653/v1/2023.findings-emnlp.150
Cite (ACL):
Keonwoo Kim and Younggun Lee. 2023. DRAFT: Dense Retrieval Augmented Few-shot Topic classifier Framework. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 2278–2294, Singapore. Association for Computational Linguistics.
Cite (Informal):
DRAFT: Dense Retrieval Augmented Few-shot Topic classifier Framework (Kim & Lee, Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.150.pdf