impact of sample selection on in-context learning for entity extraction from scientific writing

Necva Bölücü, Maciej Rybinski, Stephen Wan


Abstract
Prompt-based usage of Large Language Models (LLMs) is an increasingly popular way to tackle many well-known natural language problems. This trend is due, in part, to the appeal of the In-Context Learning (ICL) prompt set-up, in which a few selected training examples are provided along with the inference request. ICL, a type of few-shot learning, is especially attractive for natural language processing (NLP) tasks defined for specialised domains, such as entity extraction from scientific documents, where annotation is very costly due to the expertise required of annotators. In this paper, we present a comprehensive analysis of in-context sample selection methods for entity extraction from scientific documents using GPT-3.5 and compare these results against a fully supervised transformer-based baseline. Our results indicate that the effectiveness of the in-context sample selection methods is heavily domain-dependent, but the improvements are more notable for problems with a larger number of entity types. More in-depth analysis shows that ICL is more effective for low-resource set-ups of scientific information extraction.
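To illustrate the kind of pipeline the abstract describes, the sketch below shows one common in-context sample selection strategy: retrieving the training sentences most similar to the query and assembling them into a few-shot extraction prompt. This is a minimal, hypothetical illustration using bag-of-words cosine similarity and invented example data; it is not the paper's actual method, which the full text compares against several selection strategies.

```python
from collections import Counter
import math

def bag_of_words(text):
    """Lowercase whitespace tokenisation into a term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two term-frequency Counters."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def select_examples(query, pool, k=2):
    """Pick the k annotated training sentences most similar to the query."""
    q = bag_of_words(query)
    ranked = sorted(pool, key=lambda ex: cosine(q, bag_of_words(ex["text"])),
                    reverse=True)
    return ranked[:k]

def build_prompt(query, examples):
    """Assemble a few-shot entity-extraction prompt from the selected shots."""
    lines = ["Extract the scientific entities from each sentence."]
    for ex in examples:
        lines.append(f"Sentence: {ex['text']}\nEntities: {ex['entities']}")
    lines.append(f"Sentence: {query}\nEntities:")
    return "\n\n".join(lines)

# Toy annotated pool (hypothetical data, for illustration only).
pool = [
    {"text": "We fine-tune BERT on the CoNLL-2003 corpus.",
     "entities": "BERT [Model]; CoNLL-2003 [Dataset]"},
    {"text": "The alloy was annealed at 500 degrees.",
     "entities": "alloy [Material]"},
    {"text": "GPT-2 outperforms LSTM baselines on WikiText.",
     "entities": "GPT-2 [Model]; LSTM [Model]; WikiText [Dataset]"},
]
query = "We evaluate RoBERTa on the SciERC dataset."
shots = select_examples(query, pool, k=2)
print(build_prompt(query, shots))
```

In practice the similarity function would typically be a dense sentence embedding rather than token overlap, and the completed prompt would be sent to the LLM; the retrieval-then-prompt structure stays the same.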
Anthology ID:
2023.findings-emnlp.338
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5090–5107
URL:
https://aclanthology.org/2023.findings-emnlp.338
DOI:
10.18653/v1/2023.findings-emnlp.338
Cite (ACL):
Necva Bölücü, Maciej Rybinski, and Stephen Wan. 2023. impact of sample selection on in-context learning for entity extraction from scientific writing. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 5090–5107, Singapore. Association for Computational Linguistics.
Cite (Informal):
impact of sample selection on in-context learning for entity extraction from scientific writing (Bölücü et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.338.pdf