A Benchmark and Robustness Study of In-Context-Learning with Large Language Models in Music Entity Detection

Simon Hachmeier, Robert Jäschke


Abstract
Detecting music entities such as song titles or artist names is useful for applications like processing music search queries or analyzing music consumption on the web. Recent approaches incorporate smaller language models (SLMs) like BERT and achieve strong results. However, further research indicates that model performance is strongly influenced by entity exposure during pre-training. Large language models (LLMs) outperform SLMs on a variety of downstream tasks, but researchers remain divided on whether this advantage extends to tasks like entity detection in texts, due to issues like hallucination. In this paper, we provide a novel dataset of user-generated metadata and conduct a benchmark and a robustness study using recent LLMs with in-context learning (ICL). Our results indicate that LLMs in the ICL setting yield higher performance than SLMs. We further uncover the large impact of entity exposure on the best-performing LLM in our study.
Anthology ID:
2025.coling-main.658
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
9845–9859
URL:
https://aclanthology.org/2025.coling-main.658/
Cite (ACL):
Simon Hachmeier and Robert Jäschke. 2025. A Benchmark and Robustness Study of In-Context-Learning with Large Language Models in Music Entity Detection. In Proceedings of the 31st International Conference on Computational Linguistics, pages 9845–9859, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
A Benchmark and Robustness Study of In-Context-Learning with Large Language Models in Music Entity Detection (Hachmeier & Jäschke, COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.658.pdf