First Tragedy, then Parse: History Repeats Itself in the New Era of Large Language Models

Naomi Saphra, Eve Fleisig, Kyunghyun Cho, Adam Lopez


Abstract
Many NLP researchers are experiencing an existential crisis triggered by the astonishing success of ChatGPT and other systems based on large language models (LLMs). After such a disruptive change to our understanding of the field, what is left to do? Taking a historical lens, we look for guidance from the first era of LLMs, which began in 2005 with large n-gram models for machine translation (MT). We identify durable lessons from the first era, and more importantly, we identify evergreen problems where NLP researchers can continue to make meaningful contributions in areas where LLMs are ascendant. We argue that disparities in scale are transient and researchers can work to reduce them; that data, rather than hardware, is still a bottleneck for many applications; that meaningful realistic evaluation is still an open problem; and that there is still room for speculative approaches.
Anthology ID: 2024.naacl-long.128
Volume: Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month: June
Year: 2024
Address: Mexico City, Mexico
Editors: Kevin Duh, Helena Gomez, Steven Bethard
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 2310–2326
URL: https://aclanthology.org/2024.naacl-long.128
Cite (ACL): Naomi Saphra, Eve Fleisig, Kyunghyun Cho, and Adam Lopez. 2024. First Tragedy, then Parse: History Repeats Itself in the New Era of Large Language Models. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 2310–2326, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal): First Tragedy, then Parse: History Repeats Itself in the New Era of Large Language Models (Saphra et al., NAACL 2024)
PDF: https://aclanthology.org/2024.naacl-long.128.pdf
Copyright: 2024.naacl-long.128.copyright.pdf