How Far Can Pretrained LLMs Go in Symbolic Music? Controlled Comparisons of Supervised and Preference-based Adaptation

Deepak Kumar, Emmanouil Karystinaios, Gerhard Widmer, Markus Schedl
Abstract
Music shares notable parallels with language, motivating the use of pretrained large language models (LLMs) for symbolic music understanding and generation. Despite growing interest, the practical effectiveness of adapting instruction-tuned LLMs to symbolic music remains insufficiently characterized. We present a controlled comparative study of finetuning strategies for ABC-notation-based generation and understanding, comparing an off-the-shelf instruction-tuned backbone to domain-adapted variants and a music-specialized LLM baseline. Across multiple symbolic music corpora and evaluation signals, we provide insights into adaptation choices for symbolic music applications. In particular, we highlight the tradeoff between domain adaptation and preservation of prior knowledge, as well as the distinct behaviour of the metrics used to measure domain adaptation for symbolic music.
Anthology ID:
2026.nlp4musa-1.5
Volume:
Proceedings of the 4th Workshop on NLP for Music and Audio (NLP4MusA 2026)
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Elena V. Epure, Sergio Oramas, SeungHeon Doh, Pedro Ramoneda, Anna Kruspe, Mohamed Sordo
Venues:
NLP4MusA | WS
Publisher:
Association for Computational Linguistics
Pages:
27–32
URL:
https://aclanthology.org/2026.nlp4musa-1.5/
Cite (ACL):
Deepak Kumar, Emmanouil Karystinaios, Gerhard Widmer, and Markus Schedl. 2026. How Far Can Pretrained LLMs Go in Symbolic Music? Controlled Comparisons of Supervised and Preference-based Adaptation. In Proceedings of the 4th Workshop on NLP for Music and Audio (NLP4MusA 2026), pages 27–32, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
How Far Can Pretrained LLMs Go in Symbolic Music? Controlled Comparisons of Supervised and Preference-based Adaptation (Kumar et al., NLP4MusA 2026)
PDF:
https://aclanthology.org/2026.nlp4musa-1.5.pdf