Beyond Positive Scaling: How Negation Impacts Scaling Trends of Language Models

Yuhui Zhang, Michihiro Yasunaga, Zhengping Zhou, Jeff Z. HaoChen, James Zou, Percy Liang, Serena Yeung


Abstract
Language models have been shown to exhibit positive scaling, where performance improves as models are scaled up in terms of size, compute, or data. In this work, we introduce NeQA, a dataset consisting of questions with negation in which language models do not exhibit straightforward positive scaling. We show that this task can exhibit inverse scaling, U-shaped scaling, or positive scaling, and the three scaling trends shift in this order as we use more powerful prompting methods or model families. We hypothesize that solving NeQA depends on two subtasks: question answering (task 1) and negation understanding (task 2). We find that task 1 has linear scaling, while task 2 has sigmoid-shaped scaling with an emergent transition point, and composing these two scaling trends yields the final scaling trend of NeQA. Our work reveals and provides a way to analyze the complex scaling trends of language models.
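The abstract's hypothesis — that NeQA accuracy is the composition of a linearly scaling question-answering subtask and a sigmoid-shaped negation-understanding subtask — can be illustrated with a small sketch. This is a hypothetical toy model, not the paper's actual fit: the functional forms, parameter values, and the mixing rule (a model that fails to understand negation answers the un-negated question, and is therefore wrong exactly when QA succeeds) are all illustrative assumptions.

```python
import math

def task1_acc(s, slope=0.1, base=0.5):
    """Task 1 (question answering): linear scaling in model scale s
    (e.g. log-parameters), clipped to [0, 1]. Illustrative parameters."""
    return min(1.0, max(0.0, base + slope * s))

def task2_acc(s, midpoint=4.0, steepness=2.0):
    """Task 2 (negation understanding): sigmoid-shaped scaling with an
    emergent transition at `midpoint`. Illustrative parameters."""
    return 1.0 / (1.0 + math.exp(-steepness * (s - midpoint)))

def neqa_acc(s):
    """Toy composition: with probability p2 the model understands the
    negation and answers with task-1 accuracy; otherwise it answers the
    original question, so it is wrong whenever task 1 succeeds."""
    p1, p2 = task1_acc(s), task2_acc(s)
    return p2 * p1 + (1.0 - p2) * (1.0 - p1)
```

Under these assumptions the composed curve first declines (inverse scaling: task 1 improves while task 2 stays near zero), then recovers once scale passes the sigmoid's transition point, reproducing the U-shaped trend described in the abstract.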
Anthology ID:
2023.findings-acl.472
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
7479–7498
URL:
https://aclanthology.org/2023.findings-acl.472
DOI:
10.18653/v1/2023.findings-acl.472
Cite (ACL):
Yuhui Zhang, Michihiro Yasunaga, Zhengping Zhou, Jeff Z. HaoChen, James Zou, Percy Liang, and Serena Yeung. 2023. Beyond Positive Scaling: How Negation Impacts Scaling Trends of Language Models. In Findings of the Association for Computational Linguistics: ACL 2023, pages 7479–7498, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Beyond Positive Scaling: How Negation Impacts Scaling Trends of Language Models (Zhang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.472.pdf
Video:
https://aclanthology.org/2023.findings-acl.472.mp4