Exploring Precision and Recall to assess the quality and diversity of LLMs

Florian Le Bronnec, Alexandre Verine, Benjamin Negrevergne, Yann Chevaleyre, Alexandre Allauzen


Abstract
We introduce a novel evaluation framework for Large Language Models (LLMs) such as Llama-2 and Mistral, adapting the Precision and Recall metrics from image generation to text generation. This approach allows for a nuanced assessment of the quality and diversity of generated text without the need for aligned corpora. By conducting a comprehensive evaluation of state-of-the-art language models, the study reveals new insights into their performance on open-ended generation tasks, which are not adequately captured by traditional benchmarks. The findings highlight a trade-off between the quality and diversity of generated samples, particularly when models are fine-tuned on instruction datasets or with human feedback. This work extends the toolkit for distribution-based NLP evaluation, offering insights into the practical capabilities and challenges that current LLMs face in generating diverse and high-quality text.
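One common way to estimate distributional Precision and Recall for generative models is the k-nearest-neighbor manifold estimator of Kynkäänniemi et al. (2019), the family of metrics this line of work imports from image generation. The sketch below is illustrative, not the authors' exact implementation: it assumes real and generated texts have already been mapped to fixed-size embedding vectors, and estimates precision as the fraction of generated points covered by the real manifold (and recall symmetrically).

```python
import numpy as np

def knn_radii(x, k):
    """Radius of each point's k-NN ball within its own sample x (n, d)."""
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    d.sort(axis=1)
    return d[:, k]  # column 0 is the zero self-distance, so index k is the k-th neighbor

def coverage(samples, reference, radii):
    """Fraction of `samples` lying inside the union of k-NN balls around `reference`."""
    d = np.linalg.norm(samples[:, None, :] - reference[None, :, :], axis=-1)
    return float(np.mean((d <= radii[None, :]).any(axis=1)))

def precision_recall(real_emb, gen_emb, k=3):
    # Precision: generated samples covered by the real manifold (quality).
    # Recall: real samples covered by the generated manifold (diversity).
    precision = coverage(gen_emb, real_emb, knn_radii(real_emb, k))
    recall = coverage(real_emb, gen_emb, knn_radii(gen_emb, k))
    return precision, recall
```

With identical real and generated distributions both scores approach 1; a high-quality but mode-collapsed generator yields high precision and low recall, which is the quality/diversity trade-off the abstract describes.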
Anthology ID:
2024.acl-long.616
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
11418–11441
URL:
https://aclanthology.org/2024.acl-long.616
Cite (ACL):
Florian Le Bronnec, Alexandre Verine, Benjamin Negrevergne, Yann Chevaleyre, and Alexandre Allauzen. 2024. Exploring Precision and Recall to assess the quality and diversity of LLMs. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 11418–11441, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Exploring Precision and Recall to assess the quality and diversity of LLMs (Le Bronnec et al., ACL 2024)
PDF:
https://aclanthology.org/2024.acl-long.616.pdf