Sonnet or Not, Bot? Poetry Evaluation for Large Models and Datasets

Melanie Walsh, Maria Antoniak, Anna Preus


Abstract
Large language models (LLMs) can now generate and recognize poetry. But what do LLMs really know about poetry? We develop a task to evaluate how well LLMs recognize one aspect of English-language poetry—poetic form—which captures many different poetic features, including rhyme scheme, meter, and word or line repetition. Using a benchmark dataset of over 4.1k human expert-annotated poems, we show that state-of-the-art LLMs can successfully identify both common and uncommon fixed poetic forms—such as sonnets, sestinas, and pantoums—with surprisingly high accuracy. However, performance varies significantly by poetic form; the models struggle to identify unfixed poetic forms, especially those based on topic or visual features. We additionally measure how many poems from our benchmark dataset are present in popular pretraining datasets or memorized by GPT-4, finding that pretraining presence and memorization may improve performance on this task, but results are inconclusive. We release a benchmark evaluation dataset with 1.4k public domain poems and form annotations, results of memorization experiments and data audits, and code.
Anthology ID:
2024.findings-emnlp.914
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2024
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
15568–15603
URL:
https://aclanthology.org/2024.findings-emnlp.914
Cite (ACL):
Melanie Walsh, Maria Antoniak, and Anna Preus. 2024. Sonnet or Not, Bot? Poetry Evaluation for Large Models and Datasets. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 15568–15603, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Sonnet or Not, Bot? Poetry Evaluation for Large Models and Datasets (Walsh et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-emnlp.914.pdf