Syllable-level lyrics generation from melody exploiting character-level language model

Zhe Zhang, Karol Lasocki, Yi Yu, Atsuhiro Takasu


Abstract
The generation of lyrics tightly connected to accompanying melodies involves establishing a mapping between musical notes and syllables of lyrics. This process requires a deep understanding of music constraints and of semantic patterns at the syllable, word, and sentence levels. However, pre-trained language models specifically designed at the syllable level are not publicly available. To address these challenges, we propose fine-tuning character-level language models for syllable-level lyrics generation from symbolic melody. In particular, our method fine-tunes a character-level pre-trained language model so that its linguistic knowledge can be incorporated into the beam search process of a syllable-level Transformer generator network. Moreover, by exploring ChatGPT-based evaluation of generated lyrics in addition to human subjective evaluation, we show that our approach improves the coherence and correctness of generated lyrics without the need to train expensive new language models.
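The sketch below illustrates the general idea of fusing a character-level language model score into syllable-level beam search; it is a minimal, hypothetical illustration and not the authors' implementation. The generator and character-level LM here are placeholder functions standing in for the melody-conditioned Transformer and the fine-tuned character-level model described in the paper.

```python
# Minimal sketch: shallow fusion of a character-level LM score into
# syllable-level beam search. All models below are hypothetical stand-ins.
from typing import Dict, List, Tuple


def syllable_generator_logprobs(prefix: List[str]) -> Dict[str, float]:
    """Placeholder for a melody-conditioned Transformer: log P(next syllable | prefix)."""
    # In practice this distribution depends on the melody and the generated prefix.
    return {"hel": -0.4, "lo": -1.2, "world": -1.6, "</s>": -2.3}


def char_lm_logprob(text: str) -> float:
    """Placeholder for the character-level LM's log-probability of a string."""
    return -0.05 * len(text)  # toy score: shorter strings score higher


def fused_beam_search(beam_size: int = 3, max_len: int = 8,
                      lm_weight: float = 0.3) -> List[Tuple[List[str], float]]:
    """Beam search over syllables, re-scored with a character-level LM."""
    beams: List[Tuple[List[str], float]] = [([], 0.0)]
    for _ in range(max_len):
        candidates: List[Tuple[List[str], float]] = []
        for prefix, score in beams:
            if prefix and prefix[-1] == "</s>":
                candidates.append((prefix, score))  # finished hypothesis
                continue
            old_text = "".join(prefix)
            for syl, gen_lp in syllable_generator_logprobs(prefix).items():
                new_prefix = prefix + [syl]
                new_text = "".join(new_prefix)
                # Incremental char-LM contribution for the newly added syllable,
                # weighted and added to the generator's log-probability.
                lm_delta = char_lm_logprob(new_text) - char_lm_logprob(old_text)
                candidates.append((new_prefix, score + gen_lp + lm_weight * lm_delta))
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_size]
    return beams


if __name__ == "__main__":
    for syllables, score in fused_beam_search():
        print(" ".join(syllables), f"(score={score:.2f})")
```

The weight `lm_weight` controls how strongly the character-level LM's linguistic preferences influence the syllable-level hypotheses; the exact fusion scheme and weighting used in the paper may differ.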
Anthology ID:
2024.findings-eacl.89
Volume:
Findings of the Association for Computational Linguistics: EACL 2024
Month:
March
Year:
2024
Address:
St. Julian’s, Malta
Editors:
Yvette Graham, Matthew Purver
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1336–1346
URL:
https://aclanthology.org/2024.findings-eacl.89
Cite (ACL):
Zhe Zhang, Karol Lasocki, Yi Yu, and Atsuhiro Takasu. 2024. Syllable-level lyrics generation from melody exploiting character-level language model. In Findings of the Association for Computational Linguistics: EACL 2024, pages 1336–1346, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
Syllable-level lyrics generation from melody exploiting character-level language model (Zhang et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-eacl.89.pdf